Poster in Workshop: AI4DifferentialEquations in Science
Zebra: a continuous generative transformer for solving parametric PDEs
Louis Serrano · Pierre Erbacher · Jean-Noël Vittaut · Patrick Gallinari
Foundation models have revolutionized deep learning, moving beyond task-specific architectures to versatile models pre-trained with self-supervised learning on extensive datasets. These models have set new benchmarks across domains, including natural language processing, computer vision, and biology, owing to their adaptability and state-of-the-art performance on downstream tasks. Yet for solving PDEs or modeling physical dynamics, the potential of foundation models remains untapped because existing datasets are limited in scale. This study presents Zebra, a novel generative model that adapts language modeling techniques to the continuous domain of PDE solutions. Pre-trained on specific PDE families, Zebra excels at dynamics forecasting, surpassing existing neural operators and solvers, and establishes a promising path toward foundation models extensively pre-trained on varied PDE scenarios that can tackle PDE problems even when data are scarce.