Wandering within a world: Online contextualized few-shot learning

Mengye Ren · Michael L Iuzzolino · Michael Mozer · Richard Zemel

Keywords: [ few-shot learning ] [ continual learning ] [ lifelong learning ]

Wed 5 May 5 p.m. PDT — 7 p.m. PDT


We aim to bridge the gap between typical human and machine-learning environments by extending the standard framework of few-shot learning to an online, continual setting. In this setting, episodes do not have separate training and testing phases; instead, models are evaluated online while learning novel classes. As in the real world, where spatiotemporal context helps us retrieve skills learned in the past, our online few-shot learning setting also features an underlying context that changes over time. Object classes are correlated within a context, and inferring the correct context can lead to better performance. Building on this setting, we propose a new few-shot learning dataset based on large-scale indoor imagery that mimics the visual experience of an agent wandering within a world. Furthermore, we convert popular few-shot learning approaches into online versions, and we propose a new model that can make use of spatiotemporal contextual information from the recent past.
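The evaluation protocol described above can be sketched as a simple loop: for each incoming example, the model must first predict a class (or flag the example as novel) and only afterwards observes the true label, so there is no separate test phase. The sketch below is a minimal illustration under assumptions not stated in the abstract: it uses a running-prototype classifier with a fixed distance threshold for novelty, and it omits the contextual component for brevity. The function name `online_fewshot_eval` and all details of the learner are hypothetical, not the paper's actual models.

```python
import numpy as np

def online_fewshot_eval(stream, unknown_thresh=2.0):
    """Evaluate an online few-shot learner on a stream of (feature, label) pairs.

    For each example the model predicts before seeing the label, so prediction
    accuracy is measured while the model is still learning novel classes.
    The learner here is a simple running-prototype classifier (an assumption
    for illustration only).
    """
    protos, counts = {}, {}
    correct, total = 0, 0
    for x, y in stream:
        # Predict: nearest stored prototype, or None ("novel") if none is close.
        pred = None
        if protos:
            dists = {c: np.linalg.norm(x - p) for c, p in protos.items()}
            nearest = min(dists, key=dists.get)
            if dists[nearest] <= unknown_thresh:
                pred = nearest
        # Score: a None prediction is correct iff the class really is new.
        correct += int(pred == y if pred is not None else y not in protos)
        total += 1
        # Learn: the label is revealed only after prediction; update prototype.
        if y in protos:
            counts[y] += 1
            protos[y] += (x - protos[y]) / counts[y]
        else:
            protos[y] = np.array(x, dtype=float)
            counts[y] = 1
    return correct / total
```

A contextualized variant would additionally condition the prediction on an inferred context, exploiting the fact that classes co-occur within a context.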
