Towards Understanding Generalization via Decomposing Excess Risk Dynamics

Jiaye Teng · Jianhao Ma · Yang Yuan

Keywords: [ generalization ] [ dynamics ] [ excess risk ] [ stability ]

Abstract
Tue 26 Apr 6:30 p.m. PDT — 8:30 p.m. PDT


Generalization is one of the fundamental issues in machine learning. However, traditional techniques such as uniform convergence may be unable to explain generalization under overparameterization \citep{nagarajan2019uniform}. As an alternative, stability-based techniques analyze the training dynamics and derive algorithm-dependent generalization bounds. Unfortunately, stability-based bounds are still far from explaining the surprising generalization of deep learning, since neural networks usually suffer from unsatisfactory stability. This paper proposes a novel decomposition framework that improves stability-based bounds via a more fine-grained analysis of signal and noise, inspired by the observation that neural networks converge relatively slowly when fitting noise (which indicates better stability). Concretely, we decompose the excess risk dynamics and apply the stability-based bound only to the noise component. The decomposition framework performs well in both linear regimes (overparameterized linear regression) and non-linear regimes (diagonal matrix recovery). Experiments on neural networks verify the utility of the decomposition framework.
