Poster
in
Workshop: AI4DifferentialEquations In Science
Extending Deep Learning Emulation Across Parameter Regimes to Assess Stochastically Driven Spontaneous Transition Events
Ira Shokar · Peter Haynes · Rich Kerswell
Given the computational expense of simultaneous multi-task learning, we leverage fine-tuning to generalise a transformer-based network emulating a stochastic dynamical system across a range of parameters. We observe a 40-fold reduction in the training data required by fine-tuning a pre-trained network across a parameter range rather than training ab initio for each new parameter. This allows rapid adaptation of the deep learning model, which can subsequently be used across a large range of the parameter space or tailored to a specific regime of study. We demonstrate the model's ability to capture the relevant behaviour even at interpolated parameter values not seen during training. Applied to a well-researched zonal jet system, the speed-up provided by the deep learning model over numerical integration makes statistical study of rare events in the physical system computationally feasible.
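The core idea of the abstract, that warm-starting from a network pre-trained at one parameter value needs far less training than starting from scratch at each new value, can be illustrated with a deliberately minimal toy sketch. The emulator, system, and 40-fold figure here are not from the poster; this fits a one-weight model y = w·x to a family of systems y = β·x and compares convergence from a pre-trained versus a random (zero) initialisation.

```python
import numpy as np

# Toy illustration (not the authors' setup): "fine-tuning" a one-parameter
# emulator y = w * x across a family of target systems y = beta * x.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)

def train(w0, beta, lr=0.1, tol=1e-6, max_steps=10_000):
    """Gradient descent on mean-squared error; returns (w, steps_taken)."""
    y = beta * x
    w = w0
    for step in range(1, max_steps + 1):
        grad = 2.0 * np.mean((w * x - y) * x)
        w -= lr * grad
        if abs(grad) < tol:
            return w, step
    return w, max_steps

# "Pre-train" at beta = 1.0, then adapt to a nearby beta = 1.1.
w_pre, _ = train(w0=0.0, beta=1.0)
w_ft, steps_ft = train(w0=w_pre, beta=1.1)   # fine-tune from pre-trained weight
w_ab, steps_ab = train(w0=0.0, beta=1.1)     # ab initio from scratch

print(steps_ft < steps_ab)  # warm start converges in fewer steps
```

In the toy case the saving comes purely from starting closer to the new optimum; in the poster's setting the analogous saving is measured in training data rather than optimisation steps.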