A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case

Greg Ongie, Rebecca Willett, Daniel Soudry, Nathan Srebro

Keywords: inductive bias, regularization, relu networks

Abstract: We give a tight characterization of the (vectorized Euclidean) norm of weights required to realize a function $f:\mathbb{R}^d\rightarrow \mathbb{R}$ as a single hidden-layer ReLU network with an unbounded number of units (infinite width), extending the univariate characterization of Savarese et al. (2019) to the multivariate case.
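To make the object of study concrete, here is a minimal sketch (illustrative, not taken from the paper) of a single hidden-layer ReLU network and the squared Euclidean norm of its weights, the quantity whose minimal value over all realizations of $f$ the paper characterizes. All numerical values below are hypothetical; following Savarese et al. (2019), biases are left unregularized.

```python
import numpy as np

def relu_net(x, W, b, a):
    """Evaluate a one-hidden-layer ReLU net f(x) = sum_i a_i * max(w_i.x + b_i, 0)."""
    hidden = np.maximum(W @ x + b, 0.0)  # ReLU activation of each hidden unit
    return a @ hidden                    # linear combination by output weights

def weight_norm_sq(W, a):
    """Squared Euclidean norm of all (non-bias) weights: sum_i (||w_i||^2 + a_i^2)."""
    return np.sum(W**2) + np.sum(a**2)

# Hypothetical example: 3 hidden units on R^2 inputs.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, -1.0]])   # input weights, one row per hidden unit
b = np.array([0.0, -0.5, 0.0])  # biases (unregularized)
a = np.array([1.0, -2.0, 0.5])  # output weights

x = np.array([1.0, 2.0])
print(relu_net(x, W, b, a))   # network output at x: -2.0
print(weight_norm_sq(W, a))   # weight norm the characterization controls: 9.25
```

The paper's characterization describes, in function space, the infimum of `weight_norm_sq` over all (possibly infinite-width) networks realizing a given $f$.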
