ICLR 2018


Workshop

Learning to Learn Without Labels

Luke Metz · Niru Maheswaranathan · Brian Cheung

East Meeting Level 8 + 15 #4

A major goal of unsupervised learning is for algorithms to learn representations of data that are useful for subsequent tasks, without access to supervised labels or other high-level attributes. Typically, these algorithms minimize a surrogate objective, such as reconstruction error or the likelihood of a generative model, in the hope that representations useful for subsequent tasks (e.g., semi-supervised classification) will arise as a side effect. In this work, we propose using meta-learning to learn an unsupervised learning rule, meta-optimizing the learning rule directly to produce good representations for a desired task. Here, the desired task (meta-objective) is the performance of the representation on semi-supervised classification, and we meta-learn an algorithm, an unsupervised weight update rule, that produces representations that perform well under this meta-objective. We examine the performance of the learned algorithm on several datasets and show that it learns useful features, generalizes across both network architectures and a wide array of datasets, and outperforms existing unsupervised learning techniques.
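To make the setup concrete, the sketch below illustrates the general shape of such a meta-learning loop: an inner loop applies a parametric unsupervised weight-update rule to an encoder on unlabeled data, and an outer loop differentiates a few-shot classification loss on the resulting representation with respect to the rule's meta-parameters. This is a minimal, assumption-laden illustration, not the authors' implementation: the encoder, the Hebbian-style update rule, the synthetic task, and names such as update_rule and meta_objective are all stand-ins introduced here for exposition (the paper meta-learns a neural network that computes the update).

# Minimal sketch of meta-learning an unsupervised update rule (illustrative only).
import jax
import jax.numpy as jnp

D_IN, D_REP, N_CLASSES = 8, 4, 2
N_UNSUP, N_LAB = 64, 16
INNER_STEPS, META_STEPS, META_LR = 5, 100, 1e-2


def encode(phi, x):
    # Base encoder: one linear layer plus nonlinearity (a stand-in for the
    # multi-layer networks used in the paper).
    return jnp.tanh(x @ phi)


def update_rule(theta, phi, x_unsup):
    # Illustrative unsupervised update: a Hebbian-like term with meta-learned
    # gain and decay. The paper instead meta-learns a neural network rule.
    h = encode(phi, x_unsup)
    hebb = x_unsup.T @ h / x_unsup.shape[0]
    return phi + theta["gain"] * hebb - theta["decay"] * phi


def meta_objective(theta, key):
    k_phi, k_unsup, k_lab = jax.random.split(key, 3)

    # Inner loop: update the encoder weights with the learned unsupervised rule.
    phi = 0.1 * jax.random.normal(k_phi, (D_IN, D_REP))
    x_unsup = jax.random.normal(k_unsup, (N_UNSUP, D_IN))
    for _ in range(INNER_STEPS):
        phi = update_rule(theta, phi, x_unsup)

    # Meta-objective: fit a ridge-regression readout on a few labeled examples
    # and measure its error on held-out labeled examples. Synthetic task:
    # the class is the sign of the first input feature.
    x_lab = jax.random.normal(k_lab, (2 * N_LAB, D_IN))
    y_lab = jax.nn.one_hot((x_lab[:, 0] > 0).astype(jnp.int32), N_CLASSES)
    r_tr, r_te = encode(phi, x_lab[:N_LAB]), encode(phi, x_lab[N_LAB:])
    y_tr, y_te = y_lab[:N_LAB], y_lab[N_LAB:]
    ridge = r_tr.T @ r_tr + 1e-3 * jnp.eye(D_REP)
    w = jnp.linalg.solve(ridge, r_tr.T @ y_tr)
    return jnp.mean((r_te @ w - y_te) ** 2)


# Outer loop: gradient descent on the meta-parameters, differentiating
# through the unrolled inner loop.
theta = {"gain": jnp.array(0.1), "decay": jnp.array(0.01)}
grad_fn = jax.jit(jax.grad(meta_objective))
key = jax.random.PRNGKey(0)
for step in range(META_STEPS):
    key, sub = jax.random.split(key)
    theta = jax.tree_util.tree_map(
        lambda p, g: p - META_LR * g, theta, grad_fn(theta, sub)
    )

In this toy setting the meta-gradient flows from the classification loss, through the unrolled inner-loop updates, back to the update rule's parameters; the paper applies the same principle at scale, with the meta-objective evaluated across many datasets and architectures.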
