In-Person Poster presentation / poster accept

On the Soft-Subnetwork for Few-Shot Class Incremental Learning

Haeyong Kang · Jaehong Yoon · Sultan Madjid · Sung Ju Hwang · Chang Yoo

MH1-2-3-4 #81

Keywords: [ Soft-subnetwork ] [ Few-shot class incremental learning (FSCIL) ] [ Deep Learning and representational learning ]


Abstract:

Inspired by the Regularized Lottery Ticket Hypothesis, which states that competitive smooth (non-binary) subnetworks exist within a dense network, we propose a few-shot class-incremental learning method referred to as Soft-SubNetworks (SoftNet). Our objective is to learn a sequence of sessions incrementally, where each session includes only a few training instances per class, while preserving the knowledge learned in previous sessions. SoftNet jointly learns the model weights and adaptive non-binary soft masks during a base training session, where each mask consists of a major and a minor subnetwork; the former minimizes catastrophic forgetting during training, and the latter avoids overfitting to the few samples available in each new training session. We provide comprehensive empirical validation demonstrating that SoftNet effectively tackles the few-shot incremental learning problem, surpassing state-of-the-art baselines on benchmark datasets.
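To make the soft-mask idea concrete, the snippet below is a minimal, hypothetical PyTorch sketch of a soft-masked layer. It assumes magnitude-based selection of the major subnetwork and a uniform random soft mask on the minor subnetwork; the paper learns its masks jointly with the weights during the base session, so treat this as an illustration of the mechanism, not the authors' implementation. The class name `SoftMaskedLinear` and the `major_ratio` parameter are assumptions for this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftMaskedLinear(nn.Module):
    """Minimal sketch of a soft-subnetwork layer (illustrative, not the
    authors' code). Top-magnitude weights form the 'major' subnetwork
    (mask fixed at 1); the remaining 'minor' weights receive a
    non-binary soft mask drawn from U(0, 1)."""

    def __init__(self, in_features: int, out_features: int,
                 major_ratio: float = 0.7):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

        # Major subnetwork: the top `major_ratio` fraction of weights
        # by magnitude (assumed selection rule for this sketch).
        n = self.weight.numel()
        k = int(major_ratio * n)
        flat = self.weight.detach().abs().flatten()
        threshold = flat.kthvalue(n - k + 1).values  # k-th largest magnitude
        major = (self.weight.detach().abs() >= threshold).float()

        # Minor subnetwork: smooth mask in [0, 1) on everything else.
        soft = torch.rand_like(self.weight)
        self.register_buffer("mask", major + (1.0 - major) * soft)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weights = dense weights modulated by the soft mask.
        return F.linear(x, self.weight * self.mask, self.bias)


# Usage: behaves like a regular linear layer.
layer = SoftMaskedLinear(128, 64, major_ratio=0.7)
out = layer(torch.randn(32, 128))  # -> shape (32, 64)
```

Under this reading, the fully masked major weights carry the base-session knowledge and can be kept near-frozen in later sessions to limit forgetting, while the softly masked minor weights leave spare capacity for adapting to the few samples of each new session.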
