

Poster

A Riemannian Framework for Learning Reduced-order Lagrangian Dynamics

Katharina Friedl · Noémie Jaquier · Jens Lundell · Tamim Asfour · Danica Kragic

Hall 3 + Hall 2B #291
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

By incorporating physical consistency as inductive bias, deep neural networks display increased generalization capabilities and data efficiency in learning nonlinear dynamic models. However, the complexity of these models generally increases with the system dimensionality, requiring larger datasets, more complex deep networks, and significant computational effort. We propose a novel geometric network architecture to learn physically-consistent reduced-order dynamic parameters that accurately describe the original high-dimensional system behavior. This is achieved by building on recent advances in model-order reduction and by adopting a Riemannian perspective to jointly learn a nonlinear structure-preserving latent space and the associated low-dimensional dynamics. Our approach enables accurate long-term predictions of the high-dimensional dynamics of rigid and deformable systems with increased data efficiency by inferring interpretable and physically-plausible reduced Lagrangian models.
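As a rough illustration of the kind of model the abstract describes, the sketch below (in JAX) is not the authors' implementation: it is a minimal, assumed parameterization in which an encoder maps a high-dimensional state to a low-dimensional latent configuration, a learned Lagrangian (positive-definite mass matrix plus potential) defines the reduced dynamics, and latent accelerations follow from the Euler-Lagrange equations. All dimensions, network shapes, and names here are illustrative assumptions.

import jax
import jax.numpy as jnp

LATENT_DIM = 2   # assumed reduced-order dimension
FULL_DIM = 10    # assumed dimension of the original system

def init_mlp(key, sizes):
    # Initialize a small MLP as a list of (weight, bias) pairs.
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (dout, din)) * 0.1, jnp.zeros(dout)))
    return params

def mlp(params, x):
    # Forward pass with tanh activations and a linear last layer.
    for w, b in params[:-1]:
        x = jnp.tanh(w @ x + b)
    w, b = params[-1]
    return w @ x + b

def encode(params, x):
    # Map a high-dimensional state to reduced coordinates q.
    return mlp(params["encoder"], x)

def mass_matrix(params, q):
    # Positive-definite mass matrix M(q) built from a learned Cholesky factor.
    l = mlp(params["mass"], q)
    tril = jnp.zeros((LATENT_DIM, LATENT_DIM)).at[jnp.tril_indices(LATENT_DIM)].set(l)
    tril = tril.at[jnp.diag_indices(LATENT_DIM)].set(jax.nn.softplus(jnp.diag(tril)) + 1e-4)
    return tril @ tril.T

def lagrangian(params, q, qdot):
    # Reduced Lagrangian L(q, qdot) = kinetic energy - potential energy.
    kinetic = 0.5 * qdot @ mass_matrix(params, q) @ qdot
    potential = mlp(params["potential"], q)[0]
    return kinetic - potential

def reduced_acceleration(params, q, qdot):
    # Solve the Euler-Lagrange equations for the latent acceleration:
    # (d^2L/dqdot^2) qddot = dL/dq - (d/dq dL/dqdot) qdot
    grad_q = jax.grad(lagrangian, argnums=1)(params, q, qdot)
    hess = jax.hessian(lagrangian, argnums=2)(params, q, qdot)
    mixed = jax.jacobian(jax.grad(lagrangian, argnums=2), argnums=1)(params, q, qdot)
    return jnp.linalg.solve(hess, grad_q - mixed @ qdot)

# Usage: encode a sample state and query the reduced-order acceleration.
key = jax.random.PRNGKey(0)
keys = jax.random.split(key, 3)
params = {
    "encoder": init_mlp(keys[0], [FULL_DIM, 32, LATENT_DIM]),
    "mass": init_mlp(keys[1], [LATENT_DIM, 32, LATENT_DIM * (LATENT_DIM + 1) // 2]),
    "potential": init_mlp(keys[2], [LATENT_DIM, 32, 1]),
}
x = jax.random.normal(key, (FULL_DIM,))
q = encode(params, x)
qdot = jnp.zeros(LATENT_DIM)
print(reduced_acceleration(params, q, qdot))

Because the accelerations are derived from a Lagrangian with a positive-definite mass matrix, predictions of such a model remain physically consistent by construction; the paper's contribution additionally concerns learning the latent space itself in a structure-preserving, Riemannian way, which this sketch does not capture.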
