

Poster

A primer on analytical learning dynamics of nonlinear neural networks

Rodrigo Carrasco-Davis · Erin Grant

Hall 3 + Hall 2B #309
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

The learning dynamics of neural networks—in particular, how parameters change over time during training—describe how data, architecture, and algorithm interact to produce a trained neural network model. Characterizing these dynamics in general remains an open problem in machine learning, but restricting the setting permits careful empirical studies and even analytical results. In this blog post, we review approaches to analyzing the learning dynamics of nonlinear neural networks, focusing on a particular setting, known as teacher-student, that admits an explicit analytical expression for the generalization error of a nonlinear neural network trained with online gradient descent. We provide an accessible mathematical formulation of this analysis and a JAX codebase that simulates the analytical system of ordinary differential equations alongside neural network training in this setting. We conclude with a discussion of how this analytical paradigm has been used to investigate generalization in neural networks and beyond.
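To make the teacher-student idea concrete, here is a minimal standalone sketch (the authors provide a full JAX codebase; this illustration uses only the Python standard library and is not their implementation). A single "student" neuron is trained by online gradient descent on Gaussian inputs to imitate a fixed "teacher" neuron, both with activation g(x) = erf(x/√2). For this activation the generalization error has a known closed form in the order parameters Q = w·w/N, R = w·w*/N, T = w*·w*/N, which is the kind of analytical expression the post analyzes; the dimension, learning rate, and step count below are illustrative choices.

```python
# Hypothetical minimal teacher-student sketch: one student neuron learns
# one teacher neuron via online SGD; generalization error is evaluated
# through its closed form in the order parameters Q, R, T.
import math
import random

random.seed(0)
N = 50        # input dimension (illustrative)
eta = 0.5     # learning rate (illustrative)
steps = 10000 # number of online SGD steps

def g(x):
    """Activation: erf(x / sqrt(2))."""
    return math.erf(x / math.sqrt(2.0))

def g_prime(x):
    """Derivative of g."""
    return math.sqrt(2.0 / math.pi) * math.exp(-x * x / 2.0)

w_star = [random.gauss(0, 1) for _ in range(N)]  # fixed teacher weights
w = [random.gauss(0, 0.1) for _ in range(N)]     # small-init student weights

def order_params(w, w_star):
    Q = sum(a * a for a in w) / N
    R = sum(a * b for a, b in zip(w, w_star)) / N
    T = sum(b * b for b in w_star) / N
    return Q, R, T

def analytic_eg(Q, R, T):
    # Closed-form generalization error for the erf activation,
    # expressed purely in the order parameters.
    return (1.0 / math.pi) * (
        math.asin(Q / (1.0 + Q))
        + math.asin(T / (1.0 + T))
        - 2.0 * math.asin(R / math.sqrt((1.0 + Q) * (1.0 + T)))
    )

eg_start = analytic_eg(*order_params(w, w_star))
for _ in range(steps):
    x = [random.gauss(0, 1) for _ in range(N)]
    # Preactivations scaled by 1/sqrt(N) so they stay O(1).
    lam = sum(wi * xi for wi, xi in zip(w, x)) / math.sqrt(N)
    nu = sum(vi * xi for vi, xi in zip(w_star, x)) / math.sqrt(N)
    delta = (g(nu) - g(lam)) * g_prime(lam)  # online SGD error signal
    for i in range(N):
        w[i] += eta * delta * x[i] / math.sqrt(N)
eg_end = analytic_eg(*order_params(w, w_star))
print(eg_start, eg_end)
```

Because the generalization error depends on the weights only through (Q, R, T), one can alternatively evolve those three scalars with the corresponding ordinary differential equations instead of simulating the full N-dimensional network, which is the analytical route the post develops.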
