{ "id": "2111.11223", "version": "v2", "published": "2021-11-22T14:09:45.000Z", "updated": "2022-03-15T17:25:37.000Z", "title": "Transfer Learning with Gaussian Processes for Bayesian Optimization", "authors": [ "Petru Tighineanu", "Kathrin Skubch", "Paul Baireuther", "Attila Reiss", "Felix Berkenkamp", "Julia Vinogradska" ], "categories": [ "stat.ML", "cs.AI", "cs.LG" ], "abstract": "Bayesian optimization is a powerful paradigm to optimize black-box functions based on scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior based on a large amount of data, in the low-data regime, methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view of hierarchical GP models for transfer learning, which allows us to analyze the relationships between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight the strengths and weaknesses of the different transfer-learning methods.", "revisions": [ { "version": "v2", "updated": "2022-03-15T17:25:37.000Z" } ], "analyses": { "keywords": [ "transfer learning", "gaussian processes", "bayesian optimization", "closed-form boosted gp transfer model", "novel closed-form boosted gp transfer" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }