arXiv Analytics


arXiv:1906.01537 [stat.ML]

Bayesian Optimization of Composite Functions

Raul Astudillo, Peter I. Frazier

Published 2019-06-04Version 1

We consider optimization of composite objective functions, i.e., of the form $f(x)=g(h(x))$, where $h$ is a black-box, derivative-free, expensive-to-evaluate function with vector-valued outputs, and $g$ is a cheap-to-evaluate real-valued function. While these problems can be solved with standard Bayesian optimization, we propose a novel approach that exploits the composite structure of the objective function to substantially improve sampling efficiency. Our approach models $h$ using a multi-output Gaussian process and chooses where to sample using the expected improvement evaluated on the implied non-Gaussian posterior on $f$, which we call expected improvement for composite functions (EI-CF). Although EI-CF cannot be computed in closed form, we provide a novel stochastic gradient estimator that allows its efficient maximization. We also show that our approach is asymptotically consistent, i.e., that it recovers a globally optimal solution as sampling effort grows to infinity, generalizing previous convergence results for classical expected improvement. Numerical experiments show that our approach dramatically outperforms standard Bayesian optimization benchmarks, reducing simple regret by several orders of magnitude.
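Since EI-CF has no closed form, the abstract's description suggests a simple Monte Carlo estimate: draw samples of $h(x)$ from the multi-output GP posterior, push them through $g$, and average the resulting improvements. The sketch below illustrates this idea under stated assumptions; the function names (`ei_cf_mc`, `posterior_mean`, `posterior_cov`) and the toy outer function `g` are illustrative, not the authors' implementation, and the GP posterior at $x$ is assumed to be given as a mean vector and covariance matrix.

```python
# Hedged sketch: Monte Carlo estimate of EI-CF at a single point x,
# assuming the multi-output GP posterior on h(x) is multivariate normal
# with known mean vector and covariance matrix. Illustrative only.
import numpy as np

def ei_cf_mc(posterior_mean, posterior_cov, g, best_f,
             n_samples=10_000, seed=0):
    """Monte Carlo estimate of EI-CF(x) = E[max(g(h(x)) - best_f, 0)],
    where h(x) ~ N(posterior_mean, posterior_cov) under the GP posterior."""
    rng = np.random.default_rng(seed)
    # Draw samples of the vector-valued output h(x): shape (n_samples, d).
    h_samples = rng.multivariate_normal(posterior_mean, posterior_cov,
                                        size=n_samples)
    # Apply the cheap outer function g to each sampled output vector.
    f_samples = np.apply_along_axis(g, 1, h_samples)
    # Average the positive part of the improvement over the incumbent.
    return np.mean(np.maximum(f_samples - best_f, 0.0))

# Toy usage: h(x) is 2-dimensional; g is a cheap composite outer function
# (here a negated sum of squares, so larger f is better).
mu = np.array([1.0, -0.5])
cov = np.array([[0.2, 0.05],
                [0.05, 0.1]])
g = lambda y: -np.sum(y ** 2)
print(ei_cf_mc(mu, cov, g, best_f=-2.0))
```

Because the same samples can be written via the reparameterization $h(x) = \mu(x) + C(x)z$ with $z \sim N(0, I)$, the estimator also admits the stochastic gradients (with respect to $x$) that the paper uses to maximize EI-CF, though that machinery is omitted here.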

Comments: In Proceedings of the 36th International Conference on Machine Learning, PMLR 97:354-363, 2019
Categories: stat.ML, cs.LG, math.OC
Related articles:
arXiv:1805.07960 [stat.ML] (Published 2018-05-21)
Stochastic Gradient Descent for Stochastic Doubly-Nonconvex Composite Optimization
arXiv:2501.09262 [stat.ML] (Published 2025-01-16)
On the convergence of noisy Bayesian Optimization with Expected Improvement
arXiv:2501.18756 [stat.ML] (Published 2025-01-30)
A Unified Framework for Entropy Search and Expected Improvement in Bayesian Optimization