DeNOTS: Stable Deep Neural ODEs for Time Series
Ilya Kuleshov ⋅ Evgenia Romanenkova ⋅ Vladislav Zhuzhel ⋅ Galina Boeva ⋅ Evgeni Vorsin ⋅ Alexey Zaytsev
Abstract
Neural Controlled Differential Equations (Neural CDEs) provide a principled framework for modelling irregular time series in continuous time. Their number of function evaluations (NFEs) acts as a natural analogue of depth in discrete neural networks and is typically controlled indirectly via solver tolerances. However, tightening tolerances increases numerical precision without necessarily improving expressiveness. We propose a simple alternative: scaling the integration time horizon to increase NFEs and thereby "deepen" the model. Since enlarging the integration interval can cause uncontrolled growth of the hidden trajectory under standard vector fields, we introduce a Negative Feedback (NF) mechanism that ensures provable stability without limiting flexibility. We further establish general risk bounds for Neural CDEs and quantify discretization error using Gaussian process theory, improving robustness to integration and interpolation error. On four public benchmarks, our method, **DeNOTS**, outperforms existing approaches—including Neural RDEs and state space models—by up to $20\%$. DeNOTS combines expressiveness, stability, and robustness for reliable continuous-time modelling.
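The core idea of horizon scaling with negative feedback can be sketched with a toy scalar model. This is an illustrative assumption, not the paper's exact parameterisation: the `tanh` nonlinearity, the feedback gain `gamma`, and the fixed-step Euler solver are all stand-ins chosen for clarity.

```python
import math

def vector_field(h, x, gamma=1.0):
    # Hypothetical vector field: the -gamma * h negative-feedback term pulls
    # the hidden state back toward zero, preventing uncontrolled growth when
    # the integration horizon is enlarged.
    return math.tanh(x - h) - gamma * h

def integrate(x_path, T=1.0, steps=100):
    """Fixed-step Euler solve over [0, T]. At a fixed step size, a larger T
    means more function evaluations (NFEs), i.e. a 'deeper' model."""
    h, dt = 0.0, T / steps
    for k in range(steps):
        x = x_path(k / steps)  # control signal sampled along the input path
        h += dt * vector_field(h, x)
    return h

# Scaling the horizon from T=1 to T=10 (same dt) multiplies NFEs by 10,
# while the negative-feedback term keeps |h| bounded throughout.
h_short = integrate(lambda s: math.sin(6.28 * s), T=1.0, steps=100)
h_long = integrate(lambda s: math.sin(6.28 * s), T=10.0, steps=1000)
```

Without the `-gamma * h` term, an unconstrained vector field could drive the state to arbitrarily large magnitudes over a long horizon; with it, any state with `|h| > 1` is pushed back toward the origin, so the trajectory stays bounded regardless of `T`.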