Integration of Deep Neural Models and Differential Equations

Tan M Nguyen · Richard Baraniuk · Animesh Garg · Stanley J Osher · Anima Anandkumar · Bao Wang

The detailed schedule of our workshop can be found in the Google Sheet below:
Detailed Schedule
We will hold a panel discussion on how researchers from the applied math and machine learning communities can join forces to solve challenging problems in both fields. You can participate by entering your questions into the following Google Doc:
Questions for Panelists
All accepted papers are posted on our OpenReview site:
Accepted Papers

Description: Differential equations and neural networks are not only closely related but also offer complementary strengths: the modelling power and interpretability of differential equations, and the approximation and generalization power of deep neural networks. The great leap forward in machine learning empowered by deep neural networks has relied primarily on increasing amounts of data coupled with modern abstractions of distributed computing. As models and problems grow larger and more complex, the need for ever larger datasets becomes a bottleneck.

Differential equations offer a principled way to encode prior structural assumptions into nonlinear models such as deep neural networks, reducing their need for data while maintaining modelling power. These advantages allow models to scale to larger problems with better robustness and safety guarantees in practical settings.
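To make this connection concrete, below is a minimal sketch (assuming PyTorch; the block structure, step size, and dimensions are illustrative choices, not a specific method from the workshop) of how a residual network can be read as a forward-Euler discretization of an ODE, the structural prior behind neural ODE models:

```python
import torch
import torch.nn as nn

class ODEResidualBlock(nn.Module):
    """One residual block viewed as a single forward-Euler step of dx/dt = f(x)."""

    def __init__(self, dim: int, step_size: float = 0.1):
        super().__init__()
        # f parameterizes the right-hand side of the ODE (illustrative choice).
        self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))
        self.h = step_size

    def forward(self, x):
        # Euler update: x_{k+1} = x_k + h * f(x_k)
        return x + self.h * self.f(x)

# Stacking 10 blocks approximates integrating the ODE over 10 Euler steps,
# which is the structural assumption a residual architecture encodes.
model = nn.Sequential(*[ODEResidualBlock(dim=16) for _ in range(10)])
x0 = torch.randn(4, 16)
print(model(x0).shape)  # torch.Size([4, 16])
```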

While progress has been made on combining differential equations and deep neural networks, most existing work has been disjointed, and a coherent picture has yet to emerge. Substantive progress will require a principled approach that integrates ideas from disparate fields, including differential equations, machine learning, numerical analysis, optimization, and physics.

The goal of this workshop is to provide a forum where theoretical and experimental researchers of all stripes can come together not only to share reports on their progress but also to find new ways to join forces towards the goal of coherent integration of deep neural networks and differential equations. Topics to be discussed include, but are not limited to:
- Deep learning for high dimensional PDE problems
- PDE and stochastic analysis for deep learning
- PDE and analysis for new architectures
- Differential-equation interpretations of first-order optimization methods (see the sketch after this list)
- Inverse-problem approaches to learning theory
- Numerical tools to interface deep learning models and ODE/PDE solvers
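As one example of the differential-equation view of first-order optimization mentioned above, here is a minimal sketch (plain NumPy; the quadratic objective and step size are illustrative assumptions): gradient descent with step size eta is exactly the forward-Euler discretization of the gradient flow dx/dt = -grad f(x).

```python
import numpy as np

# Quadratic objective f(x) = 0.5 * x^T A x (illustrative assumption).
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])

def grad_f(x):
    return A @ x

eta = 0.1                      # step size, i.e. the ODE time step
x_gd = np.array([1.0, -2.0])   # gradient-descent iterate
x_euler = x_gd.copy()          # forward-Euler iterate for dx/dt = -grad f(x)

for _ in range(50):
    x_gd = x_gd - eta * grad_f(x_gd)              # optimization view
    x_euler = x_euler + eta * (-grad_f(x_euler))  # ODE-integration view

print(np.allclose(x_gd, x_euler))  # True: the two iterations coincide
```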