

Poster

Online Continual Learning on Class Incremental Blurry Task Configuration with Anytime Inference

Hyunseo Koh · Dahyun Kim · Jung-Woo Ha · Jonghyun Choi

Keywords: [ continual learning ]


Abstract:

Despite rapid advances in continual learning, a large body of research is devoted to improving performance within the existing setups. While a handful of works propose new continual learning setups, they still lack practicality in certain aspects. For better practicality, we first propose a novel continual learning setup that is online, task-free, and class-incremental, with blurry task boundaries, and subject to inference queries at any moment. We additionally propose a new metric to better measure the performance of continual learning methods subject to inference queries at any moment. To address the challenging setup and evaluation protocol, we propose an effective method that employs a new memory management scheme and novel learning techniques. Our empirical validation demonstrates that the proposed method outperforms prior art by large margins. Code and data splits are available at https://github.com/naver-ai/i-Blurry.
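The abstract does not define the proposed metric, but the "inference queries at any moment" protocol implies scoring accuracy continuously over the data stream rather than only after each task. The sketch below is a minimal, hypothetical illustration of such a score as a normalized area under the accuracy-versus-samples-seen curve; the function name anytime_accuracy_auc and the trapezoidal formulation are illustrative assumptions, not the paper's definition.

def anytime_accuracy_auc(accuracies, samples_seen):
    """Normalized area under the accuracy curve measured over the stream.

    accuracies   -- accuracy at each evaluation point (e.g., every k samples)
    samples_seen -- cumulative number of training samples at each point
    Both lists must have the same length (>= 2) and samples_seen is increasing.
    """
    # Trapezoidal approximation of the area under the accuracy curve,
    # then normalized by the span of the stream so the score lies in [0, 1].
    area = 0.0
    for i in range(1, len(accuracies)):
        width = samples_seen[i] - samples_seen[i - 1]
        area += width * (accuracies[i] + accuracies[i - 1]) / 2.0
    return area / (samples_seen[-1] - samples_seen[0])

# Example: a learner evaluated every 1,000 samples during online training.
print(anytime_accuracy_auc([0.20, 0.35, 0.50, 0.55], [1000, 2000, 3000, 4000]))

Because every evaluation point contributes, a method that is accurate only at the end of training scores lower than one that maintains accuracy throughout the stream, which is the behavior an anytime-inference setup is meant to reward.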
