We present preliminary results on extending Model-Agnostic Meta-Learning (MAML) (Finn et al., 2017a) to enable fast adaptation to new classification tasks in the presence of unlabeled data. Using synthetic data, we show that MAML can adapt to new tasks without any labeled examples (unsupervised adaptation) when the new task shares its output space (classes) with the training tasks. We further extend MAML to the semi-supervised few-shot learning scenario, in which the output space of the new tasks can differ from that of the training tasks.
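For readers unfamiliar with the fast-adaptation mechanism the abstract refers to, the following is a minimal sketch of MAML's inner loop (a few gradient steps from a shared initialization on task-specific data), shown on a toy 1-D regression task. The task, learning rates, and initialization here are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

def loss(w, x, y):
    # Mean squared error of a linear model y_hat = w * x.
    return np.mean((w * x - y) ** 2)

def grad(w, x, y):
    # Analytic gradient of the MSE loss with respect to w.
    return np.mean(2 * (w * x - y) * x)

def inner_adapt(w, x, y, lr=0.1, steps=5):
    # MAML's fast adaptation: a few gradient steps on the new task's data,
    # starting from the (meta-learned) initialization w.
    for _ in range(steps):
        w = w - lr * grad(w, x, y)
    return w

# A hypothetical new "task": data generated with true slope 3.0.
x = np.linspace(-1.0, 1.0, 20)
y = 3.0 * x

w_init = 0.0                      # stand-in for a meta-learned initialization
w_adapted = inner_adapt(w_init, x, y)
```

In full MAML the outer loop would then update `w_init` so that this inner adaptation works well across many tasks; the unsupervised variant described above must perform the inner step without labels `y`.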