arXiv:2002.02050 [cs.LG]

Few-Shot Learning as Domain Adaptation: Algorithm and Analysis

Jiechao Guan, Zhiwu Lu, Tao Xiang, Ji-Rong Wen

Published 2020-02-06 (Version 1)

To recognize unseen classes with only a few samples, few-shot learning (FSL) uses prior knowledge learned from the seen classes. A major challenge for FSL is that the distribution of the unseen classes differs from that of the seen classes, resulting in poor generalization even when a model is meta-trained on the seen classes. This class-difference-induced distribution shift can be considered a special case of domain shift. In this paper, for the first time, we propose a domain adaptation prototypical network with attention (DAPNA) to explicitly tackle such a domain shift problem in a meta-learning framework. Specifically, armed with a set-transformer-based attention module, we construct each episode from two sub-episodes sampled from the seen classes without class overlap, to simulate the domain shift between the seen and unseen classes. To align the feature distributions of the two sub-episodes given limited training samples, a feature transfer network is employed together with a margin disparity discrepancy (MDD) loss. Importantly, we provide a theoretical analysis establishing the learning bound of our DAPNA. Extensive experiments show that our DAPNA outperforms state-of-the-art FSL alternatives, often by significant margins.
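As a rough illustration of the episode construction described in the abstract, the sketch below (in PyTorch) splits an episode's classes into two disjoint halves and computes a standard prototypical-network loss within a sub-episode. This is not the authors' code: `split_episode` and `proto_loss` are hypothetical helper names, the encoder is assumed to exist elsewhere, and the feature transfer network with the MDD alignment loss is deliberately omitted.

```python
import torch
import torch.nn.functional as F

def split_episode(class_ids: torch.Tensor, generator=None):
    """Split an episode's classes into two disjoint halves, so the two
    sub-episodes share no classes (simulating the seen/unseen domain shift)."""
    perm = class_ids[torch.randperm(len(class_ids), generator=generator)]
    half = len(class_ids) // 2
    return perm[:half], perm[half:]

def proto_loss(support_feats, support_labels, query_feats, query_labels):
    """Prototypical-network loss within one sub-episode: prototypes are the
    mean support embeddings; queries are scored by negative squared
    Euclidean distance to each prototype."""
    classes = support_labels.unique()
    protos = torch.stack([support_feats[support_labels == c].mean(0)
                          for c in classes])
    # Remap query labels to prototype indices.
    targets = torch.stack([(classes == y).nonzero(as_tuple=True)[0][0]
                           for y in query_labels])
    logits = -torch.cdist(query_feats, protos) ** 2
    return F.cross_entropy(logits, targets)
```

In the full method, the features of the two sub-episodes would additionally be aligned via a feature transfer network trained with the MDD loss; that adversarial term is left out here since its exact form follows the MDD formulation cited in the paper.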

Related articles:
arXiv:2004.00251 [cs.LG] (Published 2020-04-01)
Self-Augmentation: Generalizing Deep Networks to Unseen Classes for Few-Shot Learning
arXiv:1703.05175 [cs.LG] (Published 2017-03-15)
Prototypical Networks for Few-shot Learning
arXiv:1910.09446 [cs.LG] (Published 2019-10-21)
Zero-shot Learning via Simultaneous Generating and Learning