arXiv:2408.16189 [stat.ML]

A More Unified Theory of Transfer Learning

Steve Hanneke, Samory Kpotufe

Published 2024-08-29 (Version 1)

We show that some basic moduli of continuity $\delta$ -- which measure how fast target risk decreases as source risk decreases -- appear to be at the root of many of the classical relatedness measures in transfer learning and related literature. Namely, bounds in terms of $\delta$ recover many of the existing bounds in terms of other measures of relatedness -- both in regression and classification -- and can at times be tighter. We are particularly interested in general situations where the learner has access to source data along with some or no target data. The unified perspective afforded by the moduli $\delta$ allows us to extend many existing notions of relatedness at once to these scenarios involving target data: interestingly, while $\delta$ itself might not be efficiently estimable, adaptive procedures exist -- based on reductions to confidence sets -- which can achieve nearly tight rates in terms of $\delta$ with no prior distributional knowledge. Such adaptivity to unknown $\delta$ immediately implies adaptivity to many classical relatedness notions, in terms of combined source and target sample sizes.
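For concreteness, a modulus of this flavor may be sketched as follows; the notation here is an illustrative assumption, not the paper's exact definition. Writing $\mathcal{E}_P(h)$ and $\mathcal{E}_Q(h)$ for the excess risks of a hypothesis $h \in \mathcal{H}$ under the source distribution $P$ and target distribution $Q$, one may set

$\delta(\epsilon) := \sup\{\, \mathcal{E}_Q(h) \;:\; h \in \mathcal{H},\ \mathcal{E}_P(h) \le \epsilon \,\}, \qquad \epsilon \ge 0.$

Under this reading, $\delta(\epsilon) \to 0$ as $\epsilon \to 0$ formalizes that driving source risk down also drives target risk down, and the rate at which $\delta$ decays quantifies how related $P$ and $Q$ are.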

Related articles:
arXiv:2111.11223 [stat.ML] (Published 2021-11-22, updated 2022-03-15)
Transfer Learning with Gaussian Processes for Bayesian Optimization
arXiv:2305.00520 [stat.ML] (Published 2023-04-30)
The ART of Transfer Learning: An Adaptive and Robust Pipeline
arXiv:2401.12272 [stat.ML] (Published 2024-01-22)
Transfer Learning for Nonparametric Regression: Non-asymptotic Minimax Analysis and Adaptive Procedure