
Incremental Learning with Unlabeled Data in the Wild

2019-03-29
Kibok Lee, Kimin Lee, Jinwoo Shin, Honglak Lee

Abstract

Deep neural networks are known to suffer from catastrophic forgetting in class-incremental learning, where performance on previous tasks drastically degrades when learning a new task. To alleviate this effect, we propose to leverage a continuous and large stream of unlabeled data in the wild. In particular, to leverage such transient external data effectively, we design a novel class-incremental learning scheme with (a) a new distillation loss, termed global distillation, (b) a learning strategy to avoid overfitting to the most recent task, and (c) a sampling strategy for the desired external data. Our experimental results on various datasets, including CIFAR and ImageNet, demonstrate the superiority of the proposed methods over prior methods, particularly when a stream of unlabeled data is accessible: we achieve up to 9.3% relative performance improvement over the state-of-the-art method.
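The paper defines global distillation precisely; as a rough illustration of the idea the abstract names, the sketch below shows a standard knowledge-distillation loss computed jointly over the logits of all classes seen so far (rather than per task), which is the "global" aspect. The tensor names and the temperature value are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def global_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Generic distillation loss over ALL previously seen classes at once.

    Both `student_logits` and `teacher_logits` are assumed to span the
    full set of classes learned so far, so inter-class similarities that
    cross task boundaries are preserved, unlike task-wise distillation,
    which matches each old task's output head separately.
    """
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradient magnitudes stay comparable to the
    # hard-label cross-entropy term (the usual KD correction).
    return F.kl_div(log_p_student, p_teacher,
                    reduction="batchmean") * temperature ** 2
```

In the setting the abstract describes, such a term could also be evaluated on sampled unlabeled external data, since distillation needs only the previous model's soft outputs, not ground-truth labels.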

URL

http://arxiv.org/abs/1903.12648

PDF

http://arxiv.org/pdf/1903.12648
