arXiv:1902.04552 [cs.LG]

Infinite Mixture Prototypes for Few-Shot Learning

Kelsey R. Allen, Evan Shelhamer, Hanul Shin, Joshua B. Tenenbaum

Published 2019-02-12 (Version 1)

We propose infinite mixture prototypes to adaptively represent both simple and complex data distributions for few-shot learning. Our infinite mixture prototypes represent each class by a set of clusters, unlike existing prototypical methods that represent each class by a single cluster. By inferring the number of clusters, infinite mixture prototypes interpolate between nearest neighbor and prototypical representations, which improves accuracy and robustness in the few-shot regime. We show the importance of adaptive capacity for capturing complex data distributions such as alphabets, with 25% absolute accuracy improvements over prototypical networks, while still maintaining or improving accuracy on the standard Omniglot and mini-ImageNet benchmarks. By clustering labeled and unlabeled data with the same clustering rule, infinite mixture prototypes achieve state-of-the-art semi-supervised accuracy. As a further capability, we show that infinite mixture prototypes can perform purely unsupervised clustering, unlike existing prototypical methods.
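To make the interpolation between nearest-neighbor and prototypical representations concrete, the sketch below clusters support embeddings per class with a DP-means-style rule: a point farther than a threshold from every existing cluster mean spawns a new cluster, so a class ends up with one prototype (prototypical-network behavior) or as many prototypes as points (nearest-neighbor behavior). This is an illustrative reading of the abstract, not the paper's exact inference: the function names and the fixed threshold lam are assumptions (the paper adapts its cluster-variance parameter), and the code operates on precomputed embeddings rather than learning them end-to-end.

import numpy as np

def infinite_mixture_prototypes(embeddings, labels, lam):
    # Assumed helper, not from the paper: per-class DP-means-style
    # clustering. A point farther than lam from every existing cluster
    # mean in its class starts a new cluster; otherwise it joins the
    # nearest one. Each class is thus represented by >= 1 prototypes.
    prototypes = {}
    for c in np.unique(labels):
        points = embeddings[labels == c]
        clusters = [[points[0]]]
        for x in points[1:]:
            means = [np.mean(cl, axis=0) for cl in clusters]
            dists = [np.linalg.norm(x - m) for m in means]
            j = int(np.argmin(dists))
            if dists[j] > lam:
                clusters.append([x])    # spawn a new cluster for this class
            else:
                clusters[j].append(x)   # assign to the nearest cluster
        prototypes[c] = np.stack([np.mean(cl, axis=0) for cl in clusters])
    return prototypes

def classify(query, prototypes):
    # Label a query by its nearest cluster prototype across all classes.
    best_c, best_d = None, np.inf
    for c, protos in prototypes.items():
        d = float(np.min(np.linalg.norm(protos - query, axis=1)))
        if d < best_d:
            best_c, best_d = c, d
    return best_c

# Toy usage on random "embeddings" (purely illustrative):
rng = np.random.default_rng(0)
support = rng.normal(size=(10, 64))
labels = np.array([0] * 5 + [1] * 5)
protos = infinite_mixture_prototypes(support, labels, lam=5.0)
print(classify(support[0], protos))

Note the role of lam: as lam grows, every class collapses to a single prototype; as it shrinks, every support point becomes its own prototype, recovering nearest-neighbor classification.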

Related articles:
arXiv:1703.05175 [cs.LG] (Published 2017-03-15)
Prototypical Networks for Few-shot Learning
arXiv:1810.09502 [cs.LG] (Published 2018-10-22)
How to train your MAML
arXiv:2002.04274 [cs.LG] (Published 2020-02-11)
Meta-Learning across Meta-Tasks for Few-Shot Learning