arXiv:2304.06619 [cs.CV]

Class-Incremental Learning of Plant and Disease Detection: Growing Branches with Knowledge Distillation

Mathieu Pagé Fortin

Published 2023-04-13 (Version 1)

This paper investigates class-incremental object detection for agricultural applications, where a model must learn new plant species and diseases incrementally without forgetting those learned previously. We adapt two public datasets to introduce new categories over time, simulating a more realistic and dynamic scenario. We then compare three class-incremental learning methods that leverage different forms of knowledge distillation to mitigate catastrophic forgetting. Our experiments show that all three methods suffer from catastrophic forgetting, but that the recent Dynamic Y-KD approach, which additionally uses a dynamic architecture that grows new branches to learn new tasks, outperforms ILOD and Faster-ILOD in most scenarios on both new and old classes. These results highlight the challenges and opportunities of continual object detection for agricultural applications. In particular, the large intra-class and small inter-class variability typical of plant images exacerbates the difficulty of learning new categories without interfering with previous knowledge. We publicly release our code to encourage future work.
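
The methods compared here all leverage knowledge distillation, whose core pattern is: the model trained on previous tasks is frozen as a teacher, and the updated model is penalized when its predictions on old classes drift from the teacher's. The sketch below illustrates that distillation term in PyTorch. It is a minimal illustration, not the paper's released code: the function names, the logit-level KL loss, and the distill_weight hyperparameter are assumptions for the sake of the example.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target distillation on old-class classification logits.

    student_logits: (N, C_old) scores the new model predicts for old classes.
    teacher_logits: (N, C_old) scores from the frozen previous-task model.
    """
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between teacher and student distributions, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2

def incremental_step_loss(detection_loss, student_logits, teacher_logits,
                          distill_weight=1.0):
    # Total loss: the standard detection loss on the new classes plus a
    # distillation term that penalizes drift on the old classes.
    return detection_loss + distill_weight * distillation_loss(
        student_logits, teacher_logits)

# Example usage: 128 proposals scored over 20 old classes.
student = torch.randn(128, 20, requires_grad=True)
teacher = torch.randn(128, 20)
loss = incremental_step_loss(torch.tensor(1.0), student, teacher)
loss.backward()

In an ILOD-style detector the detection_loss would be the usual Faster R-CNN loss computed on the new-class annotations, and the distillation term typically also covers bounding-box regression outputs and intermediate features; this sketch keeps only the classification part for brevity.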

Related articles:
arXiv:1907.09643 [cs.CV] (Published 2019-07-23)
Highlight Every Step: Knowledge Distillation via Collaborative Teaching
arXiv:2108.06681 [cs.CV] (Published 2021-08-15)
Multi-granularity for knowledge distillation
arXiv:2108.00587 [cs.CV] (Published 2021-08-02)
Semi-Supervising Learning, Transfer Learning, and Knowledge Distillation with SimCLR