arXiv:2308.09556 [math.OC]
A Principle for Global Optimization with Gradients
Published 2023-08-18 (Version 1)
This work demonstrates the utility of gradients for the global optimization of certain differentiable functions with many suboptimal local minima. To this end, a principle for generating search directions from non-local quadratic approximants, built from gradients of the objective function evaluated at well-separated points, is analyzed. Experiments measure the quality of the non-local search directions and compare the performance of a simple algorithm proposed here against the covariance matrix adaptation evolution strategy (CMA-ES) and a randomly reinitialized Broyden-Fletcher-Goldfarb-Shanno (BFGS) method.
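The abstract does not spell out how a non-local quadratic approximant yields a search direction, so the following is only a minimal sketch of one natural construction, not the paper's method: fit a one-dimensional quadratic model along the chord between two distant points from the directional derivatives at its endpoints, and step to the model's minimizer when it is convex. The function name `quadratic_step` and the example gradient are illustrative assumptions.

```python
import numpy as np

def quadratic_step(grad, x0, x1):
    """Non-local quadratic step along the chord from x0 to x1.

    Fits a 1-D quadratic q(t) ~ f(x0 + t*(x1 - x0)) using the
    directional derivatives at t = 0 and t = 1, then returns the
    minimizer of q if the model is strictly convex, else None.
    (Hypothetical construction; the paper's exact principle may differ.)
    """
    d = x1 - x0
    g0 = grad(x0) @ d      # q'(0): directional derivative at x0
    g1 = grad(x1) @ d      # q'(1): directional derivative at x1
    curv = g1 - g0         # q''(t), constant for a quadratic model
    if curv <= 0:
        return None        # model is not convex along this chord
    t_star = -g0 / curv    # minimizer of the quadratic model
    return x0 + t_star * d

# For an exactly quadratic objective f(x) = ||x||^2, the step lands
# on the exact minimizer of f restricted to the chord.
grad = lambda x: 2.0 * x
x = quadratic_step(grad, np.array([3.0, -1.0]), np.array([1.0, 4.0]))
```

Because the model uses gradients at two well-separated points rather than a local Hessian, the resulting direction can look past small-scale local minima, which is the kind of non-locality the abstract alludes to.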
Comments: 16 pages, 3 figures
Related articles:
arXiv:1707.02126 [math.OC] (Published 2017-07-07)
Global Optimization with Orthogonality Constraints via Stochastic Diffusion on Manifold
arXiv:1307.2791 [math.OC] (Published 2013-07-10)
The Effect of Hessian Evaluations in the Global Optimization αBB Method
arXiv:2107.12102 [math.OC] (Published 2021-07-26)
Global optimization using random embeddings