

Poster

Learning Awareness Models

Brandon Amos · Laurent Dinh · Serkan Cabi · Thomas Rothörl · Sergio Gómez Colmenarejo · Alistair Muldal · Tom Erez · Yuval Tassa · Nando de Freitas · Misha Denil

East Meeting level; 1,2,3 #14

Abstract:

We consider the setting of an agent with a fixed body interacting with an unknown and uncertain external world. We show that models trained to predict proprioceptive information about the agent's body come to represent objects in the external world. In spite of being trained with only internally available signals, these dynamic body models come to represent external objects through the necessity of predicting their effects on the agent's own body. That is, the model learns holistic persistent representations of objects in the world, even though the only training signals are body signals. Our dynamics model is able to successfully predict distributions over 132 sensor readings over 100 steps into the future, and we demonstrate that even when the body is no longer in contact with an object, the latent variables of the dynamics model continue to represent its shape. We show that active data collection by maximizing the entropy of predictions about the body (touch sensors, proprioception, and vestibular information) leads to learning of dynamics models that show superior performance when used for control. We also collect data from a real robotic hand and show that the same models can be used to answer questions about properties of objects in the real world. Videos with qualitative results of our models are available at https://goo.gl/mZuqAV.
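To make the exploration objective concrete: the abstract describes choosing actions that maximize the entropy of the model's predictions about the body's own sensors. Below is a minimal illustrative sketch (Python/NumPy) of that idea, not the authors' implementation; the diagonal-Gaussian predictive model, the candidate-sequence search, and all names (select_exploratory_actions, predict, toy_predict) are assumptions introduced for the example.

import numpy as np

def gaussian_entropy(sigma):
    # Differential entropy of a diagonal Gaussian with standard deviations `sigma`,
    # summed over sensor dimensions: 0.5 * sum(log(2*pi*e*sigma^2)).
    return 0.5 * np.sum(np.log(2.0 * np.pi * np.e * sigma ** 2), axis=-1)

def select_exploratory_actions(predict, state, candidate_action_seqs):
    # Score each candidate action sequence by the total predictive entropy of the
    # body-sensor trajectory it induces, and return the highest-scoring sequence.
    # `predict(state, actions)` is a hypothetical stand-in for a learned dynamics
    # model returning per-step means and std devs of shape [horizon, n_sensors].
    scores = []
    for actions in candidate_action_seqs:
        mu, sigma = predict(state, actions)           # predicted sensor distributions
        scores.append(gaussian_entropy(sigma).sum())  # sum entropy over the horizon
    return candidate_action_seqs[int(np.argmax(scores))]

if __name__ == "__main__":
    # Toy usage with a random stand-in for the learned model.
    rng = np.random.default_rng(0)
    horizon, n_sensors, n_candidates, action_dim = 100, 132, 8, 5

    def toy_predict(state, actions):
        mu = rng.normal(size=(horizon, n_sensors))
        sigma = rng.uniform(0.1, 1.0, size=(horizon, n_sensors))
        return mu, sigma

    candidates = [rng.normal(size=(horizon, action_dim)) for _ in range(n_candidates)]
    best = select_exploratory_actions(toy_predict, state=None,
                                      candidate_action_seqs=candidates)
    print("selected action sequence shape:", best.shape)

In practice the candidate sequences would come from whatever action-proposal scheme the agent uses; the sketch only shows how a predictive-entropy score could rank them for data collection.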
