arXiv Analytics

arXiv:2002.04274 [cs.LG]

Meta-Learning across Meta-Tasks for Few-Shot Learning

Nanyi Fei, Zhiwu Lu, Yizhao Gao, Jia Tian, Tao Xiang, Ji-Rong Wen

Published 2020-02-11 (Version 1)

Existing meta-learning based few-shot learning (FSL) methods typically adopt an episodic training strategy whereby each episode contains a meta-task. Across episodes, these tasks are sampled randomly and their relationships are ignored. In this paper, we argue that the inter-meta-task relationships should be exploited to learn models that generalize better to unseen classes given only a few shots. Specifically, we consider the relationships between two types of meta-task pairs and propose different strategies to exploit them. (1) Two meta-tasks with disjoint sets of classes: this pair is interesting because its relationship is reminiscent of that between the source (seen) classes and the target (unseen) classes, featuring a domain gap caused by class differences. A novel meta-training strategy named meta-domain adaptation (MDA) is proposed to make the meta-learned model more robust to this domain gap. (2) Two meta-tasks with identical sets of classes: this pair is interesting because it can be used to learn models that are robust against poorly sampled few-shot support sets. To that end, a novel meta-knowledge distillation (MKD) strategy is formulated. Extensive experiments demonstrate that both MDA and MKD significantly boost the performance of a variety of existing FSL methods, achieving new state-of-the-art results on three benchmarks.
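
The abstract does not spell out the training objectives, so the following PyTorch sketch is only illustrative. The names (sample_episode, prototype_logits, mda_loss, mkd_loss) are hypothetical, and the concrete loss forms are stand-in assumptions, not the paper's actual formulations: a ProtoNet-style classifier, a simple mean-feature alignment as a proxy for the MDA domain-gap term, and a softened-KL agreement term as a proxy for MKD.

# Illustrative sketch of the two meta-task relationships described above.
# The exact MDA/MKD losses are not given in the abstract; the alignment
# and distillation terms below are assumed stand-ins, not the authors' method.
import torch
import torch.nn.functional as F

def sample_episode(features, labels, classes, n_shot, n_query):
    """Sample an N-way episode (support/query split) over the given classes."""
    support, query = [], []
    for c in classes:
        idx = torch.nonzero(labels == c, as_tuple=True)[0]
        idx = idx[torch.randperm(len(idx))]
        support.append(features[idx[:n_shot]])
        query.append(features[idx[n_shot:n_shot + n_query]])
    return torch.cat(support), torch.cat(query)

def prototype_logits(support, query, n_way, n_shot, temperature=1.0):
    """ProtoNet-style logits: negative distance to class prototypes."""
    protos = support.view(n_way, n_shot, -1).mean(dim=1)   # (N, D)
    dists = torch.cdist(query, protos)                     # (Q, N)
    return -dists / temperature

# (1) Disjoint-class meta-task pair: MDA-style alignment (assumed form).
def mda_loss(feats_a, feats_b):
    """Align first moments of the two episodes' feature distributions,
    a crude proxy for shrinking the class-induced domain gap."""
    return (feats_a.mean(dim=0) - feats_b.mean(dim=0)).pow(2).sum()

# (2) Identical-class meta-task pair: MKD-style distillation (assumed form).
def mkd_loss(support_a, support_b, shared_query, n_way, n_shot, T=4.0):
    """Make predictions from two differently-drawn support sets agree on
    the same queries, encouraging robustness to bad few-shot draws."""
    logits_a = prototype_logits(support_a, shared_query, n_way, n_shot)
    logits_b = prototype_logits(support_b, shared_query, n_way, n_shot)
    return F.kl_div(F.log_softmax(logits_a / T, dim=1),
                    F.softmax(logits_b / T, dim=1).detach(),  # fixed "teacher"
                    reduction="batchmean") * T * T

if __name__ == "__main__":
    torch.manual_seed(0)
    feats = torch.randn(200, 64)                        # toy pre-computed features
    labels = torch.arange(10).repeat_interleave(20)     # 20 samples per class
    n_way, n_shot, n_query = 5, 5, 5
    # Two episodes with disjoint classes -> MDA term.
    sa, qa = sample_episode(feats, labels, range(0, 5), n_shot, n_query)
    sb, qb = sample_episode(feats, labels, range(5, 10), n_shot, n_query)
    print("MDA:", mda_loss(torch.cat([sa, qa]), torch.cat([sb, qb])).item())
    # Two episodes with identical classes but different support draws -> MKD term.
    s1, q = sample_episode(feats, labels, range(0, 5), n_shot, n_query)
    s2, _ = sample_episode(feats, labels, range(0, 5), n_shot, n_query)
    print("MKD:", mkd_loss(s1, s2, q, n_way, n_shot).item())

In practice either term would be added, suitably weighted, to the usual episodic classification loss; the paper's own constructions should be consulted for the actual objectives.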

Related articles:
arXiv:1810.09502 [cs.LG] (Published 2018-10-22)
How to train your MAML
arXiv:1902.04552 [cs.LG] (Published 2019-02-12)
Infinite Mixture Prototypes for Few-Shot Learning
arXiv:2002.02050 [cs.LG] (Published 2020-02-06)
Few-Shot Learning as Domain Adaptation: Algorithm and Analysis