arXiv:2207.03804 [cs.LG]

On the Subspace Structure of Gradient-Based Meta-Learning

Gustaf Tegnér, Alfredo Reichlin, Hang Yin, Mårten Björkman, Danica Kragic

Published 2022-07-08 (Version 1)

In this work we analyze the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has observed that, for image classification, this adaptation takes place only in the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional subspace of the same dimensionality as the task space, and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the space of tasks in common few-shot learning datasets.
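A minimal sketch of the kind of analysis the abstract describes, not the authors' exact procedure: collect the per-task adaptation updates (post-adaptation parameters minus the meta-learned initialization), stack them into a matrix, and estimate the dimensionality of the subspace they span via PCA. The synthetic deltas, the 95% variance threshold, and all dimensions below are illustrative assumptions; in practice the updates would come from a trained GBML model such as MAML.

```python
import numpy as np

# Hypothetical illustration: estimate the intrinsic dimension of the
# task space from the subspace spanned by post-adaptation parameter
# updates. The deltas are simulated here; with a real GBML model each
# column would be theta_task - theta0 for one adapted task.
rng = np.random.default_rng(0)
n_params, n_tasks, true_task_dim = 500, 200, 3

# Simulate adaptation updates lying (up to noise) in a low-dimensional
# subspace of parameter space, mimicking the structure the paper reports.
basis = rng.standard_normal((n_params, true_task_dim))
coeffs = rng.standard_normal((true_task_dim, n_tasks))
deltas = basis @ coeffs + 0.01 * rng.standard_normal((n_params, n_tasks))

# PCA via SVD on the centered matrix of adaptation updates.
centered = deltas - deltas.mean(axis=1, keepdims=True)
singular_values = np.linalg.svd(centered, compute_uv=False)
explained = singular_values**2 / np.sum(singular_values**2)

# Estimate the intrinsic dimension as the number of principal components
# needed to explain 95% of the variance (an assumed cutoff).
est_dim = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)
print(f"estimated intrinsic dimension: {est_dim}")  # ~3 for this setup
```

On real few-shot benchmarks the cutoff (or an effective-rank measure) would be chosen to separate the dominant components from the noise floor of the spectrum.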

Related articles:
arXiv:2502.05075 [cs.LG] (Published 2025-02-07)
Discrepancies are Virtue: Weak-to-Strong Generalization through Lens of Intrinsic Dimension
arXiv:2211.13239 [cs.LG] (Published 2022-11-23)
Relating Regularization and Generalization through the Intrinsic Dimension of Activations
arXiv:1804.08838 [cs.LG] (Published 2018-04-24)
Measuring the Intrinsic Dimension of Objective Landscapes