arXiv:2006.10236 [cs.LG]

Unsupervised Meta-Learning through Latent-Space Interpolation in Generative Models

Siavash Khodadadeh, Sharare Zehtabian, Saeed Vahidian, Weijia Wang, Bill Lin, Ladislau Bölöni

Published 2020-06-18 (Version 1)

Unsupervised meta-learning approaches rely on synthetic meta-tasks that are created using techniques such as random selection, clustering, and/or augmentation. Unfortunately, clustering and augmentation are domain-dependent, and thus they require either manual tweaking or expensive learning. In this work, we describe an approach that generates meta-tasks using generative models. A critical component is a novel approach to sampling from the latent space that generates objects grouped into synthetic classes, which form the training and validation data of a meta-task. We find that the proposed approach, LAtent Space Interpolation Unsupervised Meta-learning (LASIUM), outperforms or is competitive with current unsupervised learning baselines on few-shot classification tasks on the most widely used benchmark datasets. In addition, the approach promises to be applicable without manual tweaking over a wider range of domains than previous approaches.
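The abstract's core idea, sampling latent vectors so that generated objects cluster into synthetic classes, can be illustrated with a small sketch. This is not the paper's implementation: the generator is a placeholder, and the interpolation scheme (one anchor latent per class, with in-class variants obtained by stepping a small fraction `eps` toward independently drawn latents) is an assumption inferred from the abstract, not from the paper's actual LASIUM variants.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_meta_task(generator, latent_dim=64, n_way=5, k_shot=1, k_query=4,
                   eps=0.2):
    """Sketch of a LASIUM-style synthetic meta-task (details are assumed).

    One anchor latent vector is sampled per synthetic class; in-class
    variants are produced by interpolating slightly from the anchor
    toward freshly drawn latents, so decoded samples stay visually
    similar within a class but differ across classes.
    """
    anchors = rng.standard_normal((n_way, latent_dim))
    support, query = [], []
    for z in anchors:
        variants = []
        for _ in range(k_shot + k_query):
            z_other = rng.standard_normal(latent_dim)
            # Small interpolation step keeps the sample in the anchor's
            # neighborhood, i.e., in the same synthetic class.
            variants.append((1 - eps) * z + eps * z_other)
        images = generator(np.stack(variants))  # decode latents to objects
        support.append(images[:k_shot])         # training split of the task
        query.append(images[k_shot:])           # validation split of the task
    return np.stack(support), np.stack(query)
```

With a trained generator (e.g., a GAN or VAE decoder), `support` and `query` would hold `n_way` synthetic classes of generated images, ready to be fed to any meta-learning algorithm in place of labeled real tasks.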

Related articles (most relevant):
arXiv:2110.04616 [cs.LG] (Published 2021-10-09, updated 2022-07-26)
Discriminative Multimodal Learning via Conditional Priors in Generative Models
arXiv:1903.09030 [cs.LG] (Published 2019-03-21)
Generative Models For Deep Learning with Very Scarce Data
arXiv:1904.03445 [cs.LG] (Published 2019-04-06)
Interpolation in generative models