Virtual presentation / poster accept

Incremental Learning of Structured Memory via Closed-Loop Transcription

Shengbang Tong · Xili Dai · Ziyang Wu · Mingyang Li · Brent Yi · Yi Ma

Keywords: [ Generative Replay ] [ Incremental Learning ] [ Closed-Loop Transcription ] [ Neuroscience and Cognitive Science ]


Abstract:

This work proposes a minimal computational model for learning structured memories of multiple object classes in an incremental setting. Our approach is based on establishing a closed-loop transcription between the classes and a corresponding set of subspaces in a low-dimensional feature space, known as a linear discriminative representation. Our method is simpler than existing approaches for incremental learning, and more efficient in terms of model size, storage, and computation: it requires only a single, fixed-capacity autoencoding network whose feature space is used for both discriminative and generative purposes. Network parameters are optimized simultaneously, without architectural manipulations, by solving a constrained minimax game between the encoding and decoding maps over a single rate-reduction-based objective. Experimental results show that our method can effectively alleviate catastrophic forgetting, achieving significantly better performance than prior generative-replay methods on MNIST, CIFAR-10, and ImageNet-50, despite requiring fewer resources.
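As a rough illustration of the rate-reduction quantity that the minimax game is played over, the sketch below computes the coding rate R(Z) of a feature matrix and the rate reduction ΔR(Z), i.e., the gap between coding all features together and coding each class's features separately. This is a minimal sketch, not the authors' implementation: the function names, the precision parameter eps, and its default value are assumptions.

    import torch

    def coding_rate(Z, eps=0.5):
        # R(Z): coding length of the n feature vectors (rows of Z, shape n x d)
        # up to precision eps, via log-det of the regularized covariance.
        n, d = Z.shape
        I = torch.eye(d, device=Z.device, dtype=Z.dtype)
        return 0.5 * torch.logdet(I + (d / (n * eps ** 2)) * Z.T @ Z)

    def rate_reduction(Z, labels, eps=0.5):
        # Delta R(Z) = R(Z) - sum_j (n_j / n) R(Z_j): expand the whole feature
        # set while compressing each class toward its own subspace.
        n = Z.shape[0]
        compressed = sum(
            (Z[labels == c].shape[0] / n) * coding_rate(Z[labels == c], eps)
            for c in labels.unique()
        )
        return coding_rate(Z, eps) - compressed

Roughly speaking, in the closed-loop game the encoding map is updated to maximize this objective (evaluated on features of both inputs and their transcriptions through the decoder), while the decoding map is updated to minimize it; alternating these updates on a single fixed-capacity autoencoder is what takes the place of replay buffers or architectural growth.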
