
Poster
in
Workshop: AI4DifferentialEquations In Science

Latent Diffusion Transformer with Local Neural Field as PDE Surrogate Model

Louis Serrano · Jean-Noël Vittaut · Patrick Gallinari


Abstract:

We introduce AROMA (Attentive Reduced Order Model with Attention), a diffusion transformer architecture for modeling the dynamics of complex systems. By employing a discretization-free encoder and a local neural field decoder, we construct a latent space that accurately captures spatial structure without requiring a traditional spatial discretization. The diffusion transformer models the dynamics in this latent space, conditioned on the previous state. It iteratively refines its predictions, providing enhanced stability compared to conventional transformers and thereby enabling longer rollouts. AROMA demonstrates superior performance over existing neural field methods in simulating 1D and 2D equations, highlighting the effectiveness of our approach in capturing complex dynamical behaviors.
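The core idea of the latent dynamics model can be sketched as an autoregressive rollout in which each next latent state is produced by iteratively refining a noisy draft, conditioned on the previous state. This is a minimal toy illustration, not the paper's implementation: the names `denoise_step` and `rollout` are hypothetical, and a simple linear update stands in for the actual conditional diffusion transformer.

```python
import numpy as np

def denoise_step(z_noisy, z_prev, t):
    # Hypothetical stand-in for the conditional denoiser: in AROMA this
    # would be a transformer conditioned on the previous latent state
    # z_prev. A linear pull toward z_prev is used purely for illustration.
    return z_noisy + 0.5 * (z_prev - z_noisy) * t

def rollout(z0, n_steps, n_refine=4, seed=0):
    """Autoregressive latent rollout: each step drafts a noisy prediction
    and iteratively refines it, conditioned on the previous latent state."""
    rng = np.random.default_rng(seed)
    states = [z0]
    for _ in range(n_steps):
        z_prev = states[-1]
        z = z_prev + 0.1 * rng.standard_normal(z_prev.shape)  # noisy draft
        for k in range(n_refine, 0, -1):
            z = denoise_step(z, z_prev, k / n_refine)  # iterative refinement
        states.append(z)
    return states

states = rollout(np.zeros(8), n_steps=5)
print(len(states))  # initial state + 5 rollout steps
```

The iterative refinement loop is what the abstract credits with improved stability over a single-shot transformer prediction: errors in the draft are damped at every step before being fed back into the next rollout iteration.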