

Poster

Understanding Optimization in Deep Learning with Central Flows

Jeremy Cohen · Alex Damian · Ameet Talwalkar · Zico Kolter · Jason Lee

Hall 3 + Hall 2B #624
Thu 24 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Optimization in deep learning remains poorly understood. A key difficulty is that optimizers exhibit complex oscillatory dynamics, referred to as "edge of stability," which cannot be captured by traditional optimization theory. In this paper, we show that the path taken by an oscillatory optimizer can often be captured by a central flow: a differential equation which directly models the time-averaged (i.e. smoothed) optimization trajectory. We empirically show that these central flows can predict long-term optimization trajectories for generic neural networks with a high degree of numerical accuracy. By interpreting these flows, we are able to understand how gradient descent makes progress even as the loss sometimes goes up; how adaptive optimizers "adapt" to the local loss landscape; and how adaptive optimizers implicitly seek out regions of weight space where they can take larger steps. These insights (and others) are not apparent from the optimizers' update rules, but are revealed by the central flows. Therefore, we believe that central flows constitute a promising tool for reasoning about optimization in deep learning.
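The central flow itself is derived analytically in the paper; as rough intuition for what a "time-averaged (smoothed) trajectory" means, the sketch below (an illustration under assumed toy settings, not the paper's method) runs full-batch gradient descent on a two-parameter loss L(u, v) = (1 - uv)^2 with a step size large enough to make the iterates oscillate, then smooths the iterates with a sliding window. The loss, step size, starting point, and window length are arbitrary choices for illustration.

# Illustrative sketch only: compares a raw oscillatory gradient-descent trajectory
# with a simple running average of its iterates. The paper's central flow is an
# analytically derived differential equation, not this empirical average.
import numpy as np

def loss(w):
    u, v = w
    return (1.0 - u * v) ** 2

def grad(w):
    u, v = w
    r = 1.0 - u * v
    return np.array([-2.0 * r * v, -2.0 * r * u])

def sharpness(w):
    # Largest eigenvalue of the 2x2 Hessian of L at w (computed analytically).
    u, v = w
    H = np.array([[2.0 * v * v, 2.0 * (2.0 * u * v - 1.0)],
                  [2.0 * (2.0 * u * v - 1.0), 2.0 * u * u]])
    return np.linalg.eigvalsh(H)[-1]

eta = 0.12                   # step size; 2/eta ~ 16.7 is gradient descent's stability threshold
w = np.array([2.9, 0.35])    # start near the valley u*v = 1, where sharpness slightly exceeds 2/eta
iterates = [w.copy()]
for _ in range(3000):
    w = w - eta * grad(w)    # full-batch gradient descent step
    iterates.append(w.copy())
iterates = np.array(iterates)

# Time-average the trajectory with a short sliding window to smooth out the
# step-to-step oscillation; a central flow is designed to model this smoothed path.
window = 25
kernel = np.ones(window) / window
smoothed = np.stack([np.convolve(iterates[:, i], kernel, mode="valid")
                     for i in range(2)], axis=1)

for t in [0, 500, 1500, 2900]:
    print(f"step {t:5d}  raw loss {loss(iterates[t]):.4f}  "
          f"smoothed loss {loss(smoothed[t]):.4f}  "
          f"sharpness at smoothed point {sharpness(smoothed[t]):.2f}  (2/eta = {2 / eta:.2f})")

Printing the loss along both paths shows the contrast the abstract alludes to: the raw loss can move up and down from step to step, while the smoothed trajectory makes steadier progress, and its sharpness tends to hover near the 2/eta stability threshold.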
