

Virtual presentation / poster accept

Optimal Activation Functions for the Random Features Regression Model

Jianxin Wang · José Bento

Keywords: [ Theory ] [ Learning theory for neural networks ] [ Random Features Regression Model ] [ Functional analysis and variational calculus ]


Abstract:

The asymptotic mean squared test error and sensitivity of the Random Features Regression model (RFR) have recently been studied. We build on this work and identify in closed form the family of Activation Functions (AFs) that minimize a combination of the test error and sensitivity of the RFR under different notions of functional parsimony. We find scenarios under which the optimal AFs are linear, saturated linear functions, or expressible in terms of Hermite polynomials. Finally, we show how using optimal AFs impacts well-established properties of the RFR model, such as its double descent curve and the dependency of its optimal regularization parameter on the observation noise level.
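For readers unfamiliar with the setup, below is a minimal sketch of the RFR model the abstract refers to: ridge regression on features produced by a fixed random first layer passed through an activation function. All names (`sigma`, `lam`), the Hermite coefficients, and the synthetic data are illustrative assumptions, not the paper's construction; the paper derives the optimal AF coefficients in closed form, whereas this sketch just fixes arbitrary ones.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, N = 200, 50, 400          # samples, input dimension, number of random features
lam = 1e-2                      # ridge regularization strength (illustrative value)

X = rng.standard_normal((n, d))
y = rng.standard_normal(n)      # placeholder targets for the sketch

# Fixed random first-layer weights, scaled by 1/sqrt(d) as is standard for RFR.
W = rng.standard_normal((N, d)) / np.sqrt(d)

def sigma(z):
    # Example AF written in the probabilists' Hermite basis:
    # He_1(z) = z, He_2(z) = z**2 - 1. Coefficients here are arbitrary.
    return 1.0 * z + 0.1 * (z**2 - 1)

Phi = sigma(X @ W.T)            # n x N random feature matrix

# Trainable second-layer weights: ridge regression on the random features.
a = np.linalg.solve(Phi.T @ Phi + lam * n * np.eye(N), Phi.T @ y)

# Mean squared test error on fresh data from the same distribution.
X_test = rng.standard_normal((n, d))
y_test = rng.standard_normal(n)
mse = np.mean((sigma(X_test @ W.T) @ a - y_test) ** 2)
print(f"test MSE: {mse:.3f}")
```

Sweeping the ratio N/n in such a sketch is how the double descent curve mentioned above is typically traced out, with the choice of `sigma` and `lam` shaping the peak.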
