{ "id": "2207.03804", "version": "v1", "published": "2022-07-08T10:19:15.000Z", "updated": "2022-07-08T10:19:15.000Z", "title": "On the Subspace Structure of Gradient-Based Meta-Learning", "authors": [ "Gustaf Tegnér", "Alfredo Reichlin", "Hang Yin", "Mårten Björkman", "Danica Kragic" ], "categories": [ "cs.LG" ], "abstract": "In this work we provide an analysis of the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has noticed how, for the case of image-classification, this adaption only takes place on the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional \\emph{subspace} of the same dimensionality as the task-space and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the space of tasks of common few-shot learning datasets.", "revisions": [ { "version": "v1", "updated": "2022-07-08T10:19:15.000Z" } ], "analyses": { "keywords": [ "gradient-based meta-learning", "common few-shot learning datasets", "induced subspace structure", "intrinsic dimension", "post-adaptation parameters" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }