Poster

Overcoming Catastrophic Forgetting for Continual Learning via Model Adaptation

Wenpeng Hu · Zhou Lin · Bing Liu · Chongyang Tao · Jay Tao · Jinwen Ma · Dongyan Zhao · Rui Yan

Great Hall BC #85

Keywords: [ continual learning ] [ overcoming forgetting ] [ model adaptation ]


Abstract:

Learning multiple tasks sequentially is important for the development of AI and lifelong learning systems. However, standard neural network architectures suffer from catastrophic forgetting, which makes it difficult for them to learn a sequence of tasks. Several continual learning methods have been proposed to address this problem. In this paper, we propose a very different approach, called Parameter Generation and Model Adaptation (PGMA), to dealing with the problem. The proposed approach learns to build a model, called the solver, with two sets of parameters. The first set is shared by all tasks learned so far, and the second set is dynamically generated to adapt the solver to each test example in order to classify it. Extensive experiments have been carried out to demonstrate the effectiveness of the proposed approach.
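To make the two-parameter-set idea concrete, here is a minimal sketch of a solver whose trunk is shared across tasks while a generator network produces example-specific classifier weights at test time. This is an illustrative hypernetwork-style interpretation of the abstract, not the authors' exact PGMA architecture; all layer sizes and names (ParameterGenerator, Solver) are assumptions.

```python
import torch
import torch.nn as nn

class ParameterGenerator(nn.Module):
    """Sketch of the second parameter set: generated per test example
    (layer sizes are illustrative, not the paper's)."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        # Emits a flat vector holding a weight matrix and bias for the head.
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128),
            nn.ReLU(),
            nn.Linear(128, hidden_dim * out_dim + out_dim),
        )
        self.hidden_dim, self.out_dim = hidden_dim, out_dim

    def forward(self, x):
        flat = self.net(x)
        w = flat[:, : self.hidden_dim * self.out_dim]
        b = flat[:, self.hidden_dim * self.out_dim :]
        return w.view(-1, self.out_dim, self.hidden_dim), b

class Solver(nn.Module):
    """Solver with a shared trunk (first set) and a generated head (second set)."""
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        # First parameter set: shared by all tasks learned so far.
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.generator = ParameterGenerator(in_dim, hidden_dim, out_dim)

    def forward(self, x):
        h = self.shared(x)          # shared features
        w, b = self.generator(x)    # per-example head parameters
        # Apply each example's generated classifier head to its own features.
        return torch.bmm(w, h.unsqueeze(-1)).squeeze(-1) + b

model = Solver(in_dim=784, hidden_dim=64, out_dim=10)
x = torch.randn(32, 784)
print(model(x).shape)  # torch.Size([32, 10])
```

Because the head is regenerated for every input, the shared trunk can stay fixed as new tasks arrive, which is one way such a design can mitigate catastrophic forgetting.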
