Hidden Parameter Recurrent State Space Models For Changing Dynamics Scenarios

Vaisakh Shaj · Dieter Büchler · Rohit Sonker · Philipp Becker · Gerhard Neumann

Keywords: [ Multi-Task Learning ] [ State Space Models ] [ Recurrent Neural Networks ]

[ Abstract ]
Wed 27 Apr 10:30 a.m. PDT — 12:30 p.m. PDT


Recurrent state-space models (RSSMs) are highly expressive models for learning patterns in time series data and for system identification. However, these models typically assume that the dynamics are fixed and unchanging, which is rarely the case in real-world scenarios. Many control applications exhibit tasks with similar, but not identical, dynamics that can be modelled as sharing a common latent structure. We introduce Hidden Parameter Recurrent State Space Models (HiP-RSSMs), a framework that parametrizes a family of related state-space models with a low-dimensional set of latent factors. We present a simple and effective way of performing learning and inference over this Gaussian graphical model that avoids approximations like variational inference. We show that HiP-RSSMs outperform RSSMs and competing multi-task models on several challenging robotic benchmarks, both on real systems and in simulation.
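The core idea of conditioning a family of state-space models on a low-dimensional task latent can be sketched in a toy linear-Gaussian setting. The sketch below is illustrative only and does not reproduce the paper's architecture: the parametrization `A(l) = A_base + sum_k l_k * A_task[k]`, the function names, and all dimensions are assumptions chosen for the example; the paper's model is learned and nonlinear. Because everything here is Gaussian, inference is a closed-form Kalman update with no variational approximation, mirroring the kind of exact Gaussian inference the abstract refers to.

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, C, Q, R):
    """One predict-and-update step of a standard Kalman filter."""
    # Predict the next latent state under transition A
    mu_p = A @ mu
    Sigma_p = A @ Sigma @ A.T + Q
    # Condition on the observation y (closed-form Gaussian update)
    S = C @ Sigma_p @ C.T + R
    K = Sigma_p @ C.T @ np.linalg.inv(S)
    mu_new = mu_p + K @ (y - C @ mu_p)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_p
    return mu_new, Sigma_new

def hip_transition(A_base, A_task, l):
    """Hypothetical hidden-parameter transition: a shared base dynamics
    matrix plus a task-specific correction weighted by the latent l."""
    return A_base + np.tensordot(l, A_task, axes=1)

# Toy setup: 2-D latent state, 1-D observation, 1-D task latent
A_base = np.array([[1.0, 0.1],
                   [0.0, 1.0]])
A_task = np.array([[[0.0, 0.0],
                    [0.0, -0.05]]])  # one basis matrix per latent dimension
C = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.1]])

l = np.array([0.5])  # task latent, e.g. inferred from a context window
A = hip_transition(A_base, A_task, l)

mu, Sigma = np.zeros(2), np.eye(2)
for y in [np.array([0.2]), np.array([0.3]), np.array([0.35])]:
    mu, Sigma = kalman_step(mu, Sigma, y, A, C, Q, R)
```

Two tasks with different latents `l` share `A_base`, `C`, `Q`, and `R` and differ only through the low-dimensional correction, which is the multi-task structure the abstract describes.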
