{ "id": "2002.04274", "version": "v1", "published": "2020-02-11T09:25:13.000Z", "updated": "2020-02-11T09:25:13.000Z", "title": "Meta-Learning across Meta-Tasks for Few-Shot Learning", "authors": [ "Nanyi Fei", "Zhiwu Lu", "Yizhao Gao", "Jia Tian", "Tao Xiang", "Ji-Rong Wen" ], "categories": [ "cs.LG", "stat.ML" ], "abstract": "Existing meta-learning based few-shot learning (FSL) methods typically adopt an episodic training strategy whereby each episode contains a meta-task. Across episodes, these tasks are sampled randomly and their relationships are ignored. In this paper, we argue that the inter-meta-task relationships should be exploited to learn models that are more generalizable to unseen classes with few-shots. Specifically, we consider the relationships between two types of meta-tasks and propose different strategies to exploit them. (1) Two meta-tasks with disjoint sets of classes: these are interesting because their relationship is reminiscent of that between the source seen classes and target unseen classes, featured with domain gap caused by class differences. A novel meta-training strategy named meta-domain adaptation (MDA) is proposed to make the meta-learned model more robust to the domain gap. (2) Two meta-tasks with identical sets of classes: these are interesting because they can be used to learn models that are robust against poorly sampled few-shots. To that end, a novel meta-knowledge distillation (MKD) strategy is formulated. Extensive experiments demonstrate that both MDA and MKD significantly boost the performance of a variety of existing FSL methods and thus achieve new state-of-the-art on three benchmarks.", "revisions": [ { "version": "v1", "updated": "2020-02-11T09:25:13.000Z" } ], "analyses": { "keywords": [ "few-shot learning", "domain gap", "relationship", "learn models", "target unseen classes" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }