

Poster

Learning to Learn with Conditional Class Dependencies

Xiang Jiang · Seyed Mohammad Havaei · Farshid Varno · Gabriel Chartrand · Nicolas Chapados · Stan Matwin

Great Hall BC #79

Keywords: [ learning to learn ] [ few-shot learning ] [ meta-learning ]


Abstract:

Neural networks can learn to extract statistical properties from data, but they seldom make use of structured information from the label space to aid representation learning. Although some label structure can be captured implicitly when training on large amounts of data, in a few-shot learning context where little data is available, making explicit use of the label structure can guide the model to reshape the representation space so that it reflects a global sense of class dependencies. We propose a meta-learning framework, Conditional class-Aware Meta-Learning (CAML), that conditionally transforms feature representations based on a metric space trained to capture inter-class dependencies. This enables conditional modulation of the base-learner's feature representations to impose regularities informed by the label space. Experiments show that the conditional transformation in CAML leads to more disentangled representations and achieves competitive results on the miniImageNet benchmark.
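
The abstract describes the conditional transformation only at a high level. Below is a minimal PyTorch sketch of one plausible realization, not the paper's exact method: class prototypes (mean support embeddings, prototypical-network style) stand in for the metric space over classes, and a FiLM-style affine transformation stands in for the conditional modulation of base-learner features. All names here (class_prototypes, ConditionalModulation, to_gamma, to_beta) are hypothetical illustrations, not taken from the paper.

    import torch
    import torch.nn as nn

    def class_prototypes(support_feats, support_labels, num_classes):
        # Mean support embedding per class (prototypical-network style):
        # one plausible way to realize a metric space over classes.
        return torch.stack([
            support_feats[support_labels == c].mean(dim=0)
            for c in range(num_classes)
        ])  # shape: (num_classes, embed_dim)

    class ConditionalModulation(nn.Module):
        # FiLM-style transformation (an assumption, not the paper's exact
        # form): a class embedding predicts a per-channel scale (gamma)
        # and shift (beta) applied to base-learner features.
        def __init__(self, embed_dim, feature_dim):
            super().__init__()
            self.to_gamma = nn.Linear(embed_dim, feature_dim)
            self.to_beta = nn.Linear(embed_dim, feature_dim)

        def forward(self, features, class_embedding):
            gamma = self.to_gamma(class_embedding)  # (batch, feature_dim)
            beta = self.to_beta(class_embedding)    # (batch, feature_dim)
            return (1 + gamma) * features + beta    # residual-style modulation

    # Toy 5-way, 5-shot episode: 64-d features conditioned on 32-d
    # class embeddings drawn from the metric space.
    torch.manual_seed(0)
    support_feats = torch.randn(25, 32)             # 5 classes x 5 shots
    support_labels = torch.arange(5).repeat_interleave(5)
    protos = class_prototypes(support_feats, support_labels, num_classes=5)

    mod = ConditionalModulation(embed_dim=32, feature_dim=64)
    query_feats = torch.randn(5, 64)                # one query per class
    out = mod(query_feats, protos)                  # condition each query on its class
    print(out.shape)                                # torch.Size([5, 64])

The residual form (1 + gamma) keeps the modulation close to identity at initialization, so the conditioning perturbs rather than replaces the base-learner's representation; this is a common design choice for feature-wise modulation, assumed here rather than confirmed by the abstract.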
