An Agnostic View on the Cost of Overfitting in (Kernel) Ridge Regression

Lijia Zhou · James Simon · Gal Vardi · Nathan Srebro

Halle B #303
Wed 8 May 1:45 a.m. PDT — 3:45 a.m. PDT


We study the cost of overfitting in noisy kernel ridge regression (KRR), which we define as the ratio between the test error of the interpolating ridgeless model and the test error of the optimally tuned model. We take an "agnostic" view in the following sense: we consider the cost as a function of sample size for any target function, even if the sample size is not large enough for consistency or the target is outside the RKHS. We analyze the cost of overfitting under a Gaussian universality ansatz using recently derived (non-rigorous) risk estimates in terms of the task eigenstructure. Our analysis provides a more refined characterization of benign, tempered and catastrophic overfitting (cf. Mallinar et al. 2022).
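To make the central quantity concrete, the sketch below estimates the cost of overfitting on a toy 1D task: it fits KRR with an RBF kernel at a near-zero (interpolating) ridge and at the best ridge from a grid, then takes the ratio of their test errors. All details (kernel, target, noise level, ridge grid) are illustrative choices, not the paper's setup, and the risk is estimated empirically rather than via the eigenstructure-based formulas the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_test_mse(X_tr, y_tr, X_te, f_te, ridge):
    # KRR solution: alpha = (K + ridge*I)^{-1} y; predict with K(test, train) @ alpha.
    K = rbf_kernel(X_tr, X_tr)
    alpha = np.linalg.solve(K + ridge * np.eye(len(X_tr)), y_tr)
    preds = rbf_kernel(X_te, X_tr) @ alpha
    return np.mean((preds - f_te) ** 2)

# Toy task (hypothetical): noisy samples of a smooth target on [-1, 1].
n, n_te, noise = 100, 2000, 0.5
X_tr = rng.uniform(-1, 1, (n, 1))
X_te = rng.uniform(-1, 1, (n_te, 1))
f = lambda x: np.sin(3 * x).ravel()
y_tr = f(X_tr) + noise * rng.standard_normal(n)

# "Ridgeless" model: a tiny ridge standing in for the interpolating limit.
ridge_grid = np.logspace(-8, 2, 21)  # includes the tiny ridge, so ratio >= 1
mse_ridgeless = krr_test_mse(X_tr, y_tr, X_te, f(X_te), ridge=1e-8)
mse_tuned = min(krr_test_mse(X_tr, y_tr, X_te, f(X_te), r) for r in ridge_grid)

cost_of_overfitting = mse_ridgeless / mse_tuned
print(f"cost of overfitting: {cost_of_overfitting:.2f}")
```

Since the tiny ridge used for the "ridgeless" model is itself in the tuning grid, the ratio is at least 1 by construction; values near 1 would indicate benign overfitting, moderate values tempered, and very large values catastrophic, in the sense of the taxonomy the abstract refines.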
