arXiv Analytics

arXiv:2006.01683 [cs.CV]

Channel Distillation: Channel-Wise Attention for Knowledge Distillation

Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu

Published 2020-06-02, Version 1

Knowledge distillation transfers the knowledge learned by a teacher network to a student network, so that the student retains the advantages of fewer parameters and less computation while achieving accuracy close to the teacher's. In this paper, we propose a new distillation method that consists of two transfer strategies and a loss-decay strategy. The first transfer strategy, Channel Distillation (CD), is based on channel-wise attention: CD transfers channel-level attention information from the teacher to the student. The second is Guided Knowledge Distillation (GKD). Unlike Knowledge Distillation (KD), which makes the student mimic the teacher's prediction distribution on every sample, GKD lets the student mimic the teacher only on samples the teacher predicts correctly. The last component is Early Decay Teacher (EDT): during training, we gradually decay the weight of the distillation loss so that the student, rather than the teacher, progressively takes control of the optimization. Our proposed method is evaluated on ImageNet and CIFAR100. On ImageNet, we achieve a top-1 error of 27.68% with ResNet18, which outperforms state-of-the-art methods. On CIFAR100, we obtain the surprising result that the student outperforms the teacher. Code is available at https://github.com/zhouzaida/channel-distillation.
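
The following is a minimal PyTorch sketch of the three components described in the abstract. It is an illustration under assumptions, not the authors' implementation (see their repository for that): the channel attention is assumed to be SE-style global average pooling, the CD loss is assumed to be an MSE between student and teacher channel statistics, and the EDT schedule is assumed linear; all function names here are hypothetical.

```python
import torch
import torch.nn.functional as F

def channel_attention(feat):
    # Global average pooling yields one attention value per channel,
    # in the spirit of SE-style channel-wise attention (assumed form).
    return feat.mean(dim=(2, 3))  # (N, C, H, W) -> (N, C)

def cd_loss(student_feats, teacher_feats):
    # Channel Distillation (CD): match per-channel attention statistics of
    # corresponding student/teacher feature maps. Assumes matching channel
    # counts per stage (true for, e.g., ResNet18 vs. ResNet34); MSE is an
    # assumed choice of distance.
    loss = 0.0
    for s, t in zip(student_feats, teacher_feats):
        loss = loss + F.mse_loss(channel_attention(s), channel_attention(t))
    return loss / len(student_feats)

def gkd_loss(student_logits, teacher_logits, labels, T=4.0):
    # Guided Knowledge Distillation (GKD): standard temperature-scaled KD
    # loss, but applied only on samples the teacher classifies correctly.
    correct = teacher_logits.argmax(dim=1).eq(labels)
    if correct.sum() == 0:
        return student_logits.new_zeros(())
    s = F.log_softmax(student_logits[correct] / T, dim=1)
    t = F.softmax(teacher_logits[correct] / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

def edt_weight(epoch, total_epochs, w0=1.0):
    # Early Decay Teacher (EDT): decay the distillation-loss weight over
    # training so the student gradually takes over the optimization
    # (a linear schedule is assumed here).
    return w0 * max(0.0, 1.0 - epoch / total_epochs)
```

The total training loss would then combine the usual cross-entropy with the decayed distillation terms, e.g. F.cross_entropy(student_logits, labels) + edt_weight(epoch, epochs) * (cd_loss(student_feats, teacher_feats) + gkd_loss(student_logits, teacher_logits, labels)).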

Related articles:
arXiv:1611.05594 [cs.CV] (Published 2016-11-17)
SCA-CNN: Spatial and Channel-wise Attention in Convolutional Networks for Image Captioning
arXiv:2305.07586 [cs.CV] (Published 2023-05-12)
Knowledge distillation with Segment Anything (SAM) model for Planetary Geological Mapping
arXiv:2304.06619 [cs.CV] (Published 2023-04-13)
Class-Incremental Learning of Plant and Disease Detection: Growing Branches with Knowledge Distillation