arXiv Analytics


arXiv:2003.08798 [cs.CV]

Incremental Object Detection via Meta-Learning

K J Joseph, Jathushan Rajasegaran, Salman Khan, Fahad Shahbaz Khan, Vineeth Balasubramanian, Ling Shao

Published 2020-03-17 (Version 1)

In a real-world setting, object detectors may continuously encounter object instances from new classes. When existing object detectors are applied to such scenarios, their performance on old classes deteriorates significantly. A few efforts have been reported to address this limitation, all of which apply variants of knowledge distillation to avoid catastrophic forgetting. We note that although distillation helps to retain previous learning, it obstructs fast adaptability to new tasks, which is a critical requirement for incremental learning. In this pursuit, we propose a meta-learning approach that learns to reshape model gradients such that information across incremental tasks is optimally shared. This ensures seamless information transfer via a meta-learned gradient preconditioning that minimizes forgetting and maximizes knowledge transfer. In comparison to existing meta-learning methods, our approach is task-agnostic, allows incremental addition of new classes, and scales to large-sized models for object detection. We evaluate our approach on a variety of incremental settings defined on the PASCAL-VOC and MS COCO datasets, demonstrating significant improvements over the state-of-the-art.
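The core idea of gradient preconditioning can be illustrated with a toy example. The sketch below is an assumption-laden simplification, not the paper's actual meta-learning procedure: it applies a fixed diagonal preconditioner (`precond`) element-wise to the gradient of a toy quadratic loss, whereas the paper meta-learns the preconditioning so that updates on new tasks minimally interfere with old ones. All names (`loss_grad`, `precond`, `lr`) are hypothetical.

```python
import numpy as np

# Toy sketch of a preconditioned gradient update: g <- P * g before the
# parameter step. In the paper, P is meta-learned across incremental
# tasks; here it is simply the identity (all ones) for illustration.

rng = np.random.default_rng(0)

theta = rng.normal(size=4)       # model parameters
precond = np.ones_like(theta)    # diagonal preconditioner (identity here)

def loss_grad(theta):
    # Gradient of a toy quadratic loss 0.5 * ||theta||^2
    return theta

lr = 0.1
for _ in range(100):
    g = loss_grad(theta)
    theta -= lr * (precond * g)  # preconditioned gradient step
```

A meta-learned `precond` would instead be trained so that gradient directions harmful to previously learned classes are damped while directions useful to the new task are preserved.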

Related articles:
arXiv:2307.12427 [cs.CV] (Published 2023-07-23)
Augmented Box Replay: Overcoming Foreground Shift for Incremental Object Detection
arXiv:2204.02136 [cs.CV] (Published 2022-04-05)
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
arXiv:2304.03110 [cs.CV] (Published 2023-04-06)
Continual Detection Transformer for Incremental Object Detection