arXiv Analytics

arXiv:2112.04564 [cs.CV]

CoSSL: Co-Learning of Representation and Classifier for Imbalanced Semi-Supervised Learning

Yue Fan, Dengxin Dai, Bernt Schiele

Published 2021-12-08, updated 2022-05-13 (version 2)

In this paper, we propose a novel co-learning framework (CoSSL) with decoupled representation learning and classifier learning for imbalanced semi-supervised learning (SSL). To handle the data imbalance, we devise Tail-class Feature Enhancement (TFE) for classifier learning. Furthermore, the current evaluation protocol for imbalanced SSL focuses only on balanced test sets, which has limited practicality in real-world scenarios. Therefore, we further conduct a comprehensive evaluation under various shifted test distributions. In experiments, our approach outperforms other methods over a large range of shifted distributions, achieving state-of-the-art performance on benchmark datasets including CIFAR-10, CIFAR-100, ImageNet, and Food-101. Our code will be made publicly available.
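
The abstract only names Tail-class Feature Enhancement at a high level. As a rough, hypothetical illustration (not the authors' released implementation), the sketch below shows one way such feature-level enhancement could look: a labeled tail-class feature is blended with a randomly drawn unlabeled feature, with the mixing weight biased toward the labeled feature so its tail-class label can be reused for classifier training. The function name tfe_blend and the parameter lam_low are assumptions introduced here for illustration.

```python
# Minimal sketch of the idea behind Tail-class Feature Enhancement (TFE):
# blend a tail-class labeled feature with a randomly drawn unlabeled feature
# so the classifier head sees more diverse tail-class features.
# Names (tfe_blend, lam_low) are hypothetical; this is NOT the authors' code.
import torch

def tfe_blend(tail_feat: torch.Tensor,
              unlabeled_feats: torch.Tensor,
              lam_low: float = 0.5) -> torch.Tensor:
    """Return an enhanced tail-class feature.

    tail_feat:       (d,) feature of a labeled tail-class sample.
    unlabeled_feats: (n, d) features of unlabeled samples.
    lam_low:         lower bound on the mixing weight so the tail-class
                     feature dominates and its label can be kept (assumption).
    """
    # Pick a random unlabeled feature to mix with.
    idx = torch.randint(0, unlabeled_feats.size(0), (1,)).item()
    # Sample a mixing coefficient biased toward the labeled feature.
    lam = lam_low + (1.0 - lam_low) * torch.rand(1).item()
    # Convex combination in feature space; the tail-class label is reused.
    return lam * tail_feat + (1.0 - lam) * unlabeled_feats[idx]

# Usage: enhanced = tfe_blend(feat_of_tail_sample, feats_from_unlabeled_batch)
```

For the paper's actual formulation and training procedure, see the code linked in the comments below.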

Comments: Published at CVPR 2022 as a conference paper. Code at https://github.com/YUE-FAN/CoSSL
Categories: cs.CV, cs.LG
Related articles:
arXiv:2303.07269 [cs.CV] (Published 2023-03-13)
InPL: Pseudo-labeling the Inliers First for Imbalanced Semi-supervised Learning
arXiv:2106.00209 [cs.CV] (Published 2021-06-01)
Rethinking Re-Sampling in Imbalanced Semi-Supervised Learning
arXiv:2311.01646 [cs.CV] (Published 2023-11-03)
SemiGPC: Distribution-Aware Label Refinement for Imbalanced Semi-Supervised Learning Using Gaussian Processes