

Poster in Workshop: AI4DifferentialEquations In Science

Consistency Matters: Neural ODE Parameters are Dependent on the Training Numerical Method

C. Coelho · M. Fernanda Costa · Luís Ferrás


Abstract:

Neural Ordinary Differential Equations (Neural ODEs) are continuous-depth models that use an ordinary differential equation (ODE) to capture the dynamics of data. Owing to their modelling capabilities, several works on applications and novel architectures using Neural ODEs can be found in the literature. In this work, we call attention to the need to use the same numerical method for both training and prediction with Neural ODEs: the numerical method employed influences the prediction process, thereby impacting the loss function and introducing variance into parameter optimisation. We provide theoretical insights into how numerical methods of varying orders or with different step sizes influence the loss function of the network. To validate our theoretical analysis, we conduct a series of simple preliminary numerical experiments on a regression task, demonstrating how the numerical method used during training influences test performance. Our findings underscore the need for consistency between the numerical methods used for training and prediction, a consideration not previously emphasised or documented in the literature.
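The effect described in the abstract can be illustrated with a minimal sketch (not the authors' experimental setup): a one-parameter "Neural ODE" with dynamics f(y, θ) = θ·y, integrated with forward Euler versus classical fourth-order Runge-Kutta (RK4). With the same parameter and step size, the two solvers give different predictions, hence different losses; conversely, the parameter that best fits the data under Euler differs from the true one, because it absorbs the solver's truncation error. All names and values here are illustrative assumptions.

```python
import numpy as np

def f(y, theta):
    # toy one-parameter dynamics standing in for a neural network (assumption)
    return theta * y

def euler(y0, theta, h, n):
    # forward Euler: first-order method
    y = y0
    for _ in range(n):
        y = y + h * f(y, theta)
    return y

def rk4(y0, theta, h, n):
    # classical Runge-Kutta: fourth-order method
    y = y0
    for _ in range(n):
        k1 = f(y, theta)
        k2 = f(y + 0.5 * h * k1, theta)
        k3 = f(y + 0.5 * h * k2, theta)
        k4 = f(y + h * k3, theta)
        y = y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

theta, y0, h, n = 1.0, 1.0, 0.1, 10          # integrate y' = y from t=0 to t=1
target = np.exp(1.0)                          # exact solution y(1) = e

# Same parameter, different solver -> different prediction and loss.
loss_euler = (euler(y0, theta, h, n) - target) ** 2
loss_rk4 = (rk4(y0, theta, h, n) - target) ** 2

# The theta that makes Euler hit the target exactly solves (1 + h*theta)^n = e,
# so training with Euler converges to a biased parameter that compensates
# for the solver's error; predicting with RK4 then misses the target.
theta_euler_fit = (np.exp(1.0 / n) - 1.0) / h
```

Here Euler with h = 0.1 predicts (1.1)^10 ≈ 2.594 while RK4 is within about 10⁻⁶ of e ≈ 2.718, and the Euler-fitted parameter comes out above the true value θ = 1, which is exactly the training/prediction inconsistency the abstract warns about.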
