arXiv:2006.13748 [cs.CV]

Insights from the Future for Continual Learning

Arthur Douillard, Eduardo Valle, Charles Ollion, Thomas Robert, Matthieu Cord

Published 2020-06-24 (Version 1)

Continual learning aims to learn tasks sequentially, with (often severe) constraints on the storage of old learning samples, without suffering from catastrophic forgetting. In this work, we propose prescient continual learning, a novel experimental setting that incorporates existing information about the classes prior to any training data. In a traditional continual learning setting, each task evaluates the model on present and past classes, the latter with a limited number of training samples. Our setting adds future classes, with no training samples at all. We introduce Ghost Model, a representation-learning-based model for continual learning that uses ideas from zero-shot learning. A generative model of the representation space, in concert with a careful adjustment of the losses, allows us to exploit insights from future classes to constrain the spatial arrangement of the past and current classes. Quantitative results on the AwA2 and aP&Y datasets, together with detailed visualizations, demonstrate the interest of this new setting and of the method we propose to address it.
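
The abstract only outlines the mechanism; the following is a minimal sketch of the general idea, not the authors' implementation. It assumes a zero-shot-style generator that maps class attribute vectors to "ghost" centroids in feature space for future classes, plus a hypothetical margin loss that keeps real features of seen classes away from those ghost regions. All module names, dimensions, and loss weights below are illustrative assumptions.

```python
# Hypothetical sketch: attribute-conditioned ghost features constraining the
# feature space of seen classes. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttributeToFeatureGenerator(nn.Module):
    """Maps class attribute vectors to mean feature vectors (ghost centroids)."""
    def __init__(self, attr_dim: int, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(attr_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim),
        )

    def forward(self, attributes: torch.Tensor) -> torch.Tensor:
        return self.net(attributes)

def ghost_margin_loss(features: torch.Tensor,
                      ghost_centroids: torch.Tensor,
                      margin: float = 1.0) -> torch.Tensor:
    """Penalize real features that fall within `margin` of any ghost centroid."""
    dists = torch.cdist(features, ghost_centroids)   # (batch, num_ghosts)
    return F.relu(margin - dists).mean()

# Usage: combine the ghost constraint with the usual classification loss.
feat_dim, attr_dim = 128, 85                          # e.g. AwA2 provides 85 attributes
generator = AttributeToFeatureGenerator(attr_dim, feat_dim)
future_attrs = torch.randn(10, attr_dim)              # attributes of future (unseen) classes
ghosts = generator(future_attrs)                      # ghost centroids in feature space

features = torch.randn(32, feat_dim, requires_grad=True)  # backbone features for a batch
logits = torch.randn(32, 50, requires_grad=True)          # classifier outputs over seen classes
labels = torch.randint(0, 50, (32,))

loss = F.cross_entropy(logits, labels) + 0.1 * ghost_margin_loss(features, ghosts)
loss.backward()
```

Under these assumptions, the margin term reserves regions of the representation space for classes that have not yet arrived, which is one plausible way to read "constrain the spatial arrangement of the past and current classes."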
