

In-Person Poster presentation / poster accept

Tuning Frequency Bias in Neural Network Training with Nonuniform Data

Annan Yu · Yunan Yang · Alex Townsend

MH1-2-3-4 #157

Keywords: [ Theory ] [ neural tangent kernel ] [ Sobolev norms ] [ nonuniform ] [ frequency bias ] [ training ] [ neural networks ]


Abstract:

Small generalization errors of over-parameterized neural networks (NNs) can be partially explained by the frequency biasing phenomenon, where gradient-based algorithms minimize the low-frequency misfit before reducing the high-frequency residuals. Using the Neural Tangent Kernel (NTK), one can provide a theoretically rigorous analysis for training where data are drawn from constant or piecewise-constant probability densities. Since most training data sets are not drawn from such distributions, we use the NTK model and a data-dependent quadrature rule to theoretically quantify the frequency biasing of NN training given fully nonuniform data. By replacing the loss function with a carefully selected Sobolev norm, we can further amplify, dampen, counterbalance, or reverse the intrinsic frequency biasing in NN training.
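The mechanism behind the last sentence can be made concrete: a Sobolev H^s norm weights the k-th Fourier coefficient of the misfit by (1 + |k|^2)^s, so s > 0 amplifies the high-frequency residuals while s < 0 dampens them. Below is a minimal NumPy sketch of such a loss on a uniform 1D grid; this is an illustration under stated assumptions, not the authors' implementation, and the name sobolev_loss and its parameters are hypothetical.

```python
import numpy as np

def sobolev_loss(residual, s=1.0, domain_length=1.0):
    """Discrete Sobolev H^s norm (squared) of a residual on a uniform 1D grid.

    Weighting the k-th Fourier mode by (1 + |k|^2)^s amplifies (s > 0)
    or dampens (s < 0) the high-frequency part of the misfit, which is
    how a Sobolev loss can tune the frequency bias of training.
    Illustrative sketch only; names and defaults are not from the paper.
    """
    n = residual.shape[-1]
    coeffs = np.fft.rfft(residual, norm="ortho")               # Fourier coefficients
    k = 2 * np.pi * np.fft.rfftfreq(n, d=domain_length / n)    # angular wavenumbers
    weights = (1.0 + k**2) ** s
    return np.sum(weights * np.abs(coeffs) ** 2)

# Example: a residual with one low- and one high-frequency component.
x = np.linspace(0.0, 1.0, 256, endpoint=False)
residual = np.sin(2 * np.pi * x) + 0.1 * np.sin(40 * np.pi * x)
print(sobolev_loss(residual, s=1.0))    # high-frequency term penalized more
print(sobolev_loss(residual, s=-1.0))   # high-frequency term de-emphasized
```

In an actual training loop one would differentiate such a loss through the network outputs with an autodiff framework, and for the fully nonuniform data considered in the paper the uniform-grid FFT above would be replaced by the data-dependent quadrature rule the abstract describes.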
