

Poster in Workshop: AI4DifferentialEquations In Science

Neural Context Flows for Learning Generalizable Dynamical Systems

Roussel Desmond Nzoyem · David Barton · Tom Deakin


Abstract:

Neural Ordinary Differential Equations (Neural ODEs) typically struggle to generalize to new dynamical behaviors created by parameter changes in the underlying system, even when the new dynamics are close to previously seen behaviors. The problem is worse when the changing parameters are unobserved, i.e., their values or influence cannot be directly measured when collecting data. We introduce Neural Context Flow (NCF), a framework that encodes these unobserved parameters in a latent context vector provided as input to the vector field. NCFs leverage the differentiability of the vector field with respect to this context vector, combined with a first-order Taylor expansion, to allow any context vector to influence trajectories generated under other parameter settings. We validate our method against established Multi-Task and Meta-Learning alternatives, showing an improvement of 5% in mean squared error for in-domain evaluation and 35% for out-of-distribution evaluation on the Lotka-Volterra forecasting problem, thus establishing a new state of the art. This study has practical implications for foundation models in science and other areas that benefit from conditional Neural ODEs. Our code is openly available at \url{AnonymousGithubRepo}.
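The abstract's core mechanism, a context-conditioned vector field linearized around another environment's context via a first-order Taylor expansion, can be illustrated with a minimal sketch. This is not the authors' implementation; the names (`vector_field`, `ctx_i`, `ctx_j`, the tiny MLP parameterization) are illustrative assumptions, written in JAX because forward-mode differentiation (`jax.jvp`) gives the Jacobian-vector product the expansion needs.

```python
import jax
import jax.numpy as jnp

def vector_field(x, ctx, params):
    # Hypothetical context-conditioned vector field: a small MLP whose
    # input is the state concatenated with the latent context vector.
    h = jnp.concatenate([x, ctx])
    h = jnp.tanh(params["W1"] @ h + params["b1"])
    return params["W2"] @ h + params["b2"]

def taylor_vector_field(x, ctx_i, ctx_j, params):
    # First-order Taylor expansion of f(x, ctx) around ctx_j, evaluated at ctx_i:
    #   f(x, ctx_j) + (df/dctx)|_{ctx_j} (ctx_i - ctx_j)
    # so that context j can influence trajectories of environment i.
    f_j, df = jax.jvp(lambda c: vector_field(x, c, params),
                      (ctx_j,), (ctx_i - ctx_j,))
    return f_j + df

# Toy usage with assumed dimensions (state dim 2, context dim 4, hidden dim 32).
key = jax.random.PRNGKey(0)
d, c, h = 2, 4, 32
params = {
    "W1": 0.1 * jax.random.normal(key, (h, d + c)),
    "b1": jnp.zeros(h),
    "W2": 0.1 * jax.random.normal(key, (d, h)),
    "b2": jnp.zeros(d),
}
x = jnp.ones(d)
ctx_i, ctx_j = jnp.zeros(c), 0.1 * jnp.ones(c)
print(taylor_vector_field(x, ctx_i, ctx_j, params))
```

In a full training loop one would integrate this expanded vector field with an ODE solver and fit both the shared network parameters and the per-environment context vectors; those details are beyond this sketch.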
