arXiv Analytics

arXiv:2209.12467 [math.OC]

Convergence rate of the (1+1)-evolution strategy on locally strongly convex functions with Lipschitz continuous gradient and their monotonic transformations

Daiki Morinaga, Kazuto Fukuchi, Jun Sakuma, Youhei Akimoto

Published 2022-09-26Version 1

Evolution strategy (ES) is one of the promising classes of algorithms for black-box continuous optimization. Despite its broad success in applications, theoretical analysis of its convergence speed has been limited to convex quadratic functions and their monotonic transformations. In this study, an upper bound and a lower bound on the rate of linear convergence of the (1+1)-ES on locally $L$-strongly convex functions with $U$-Lipschitz continuous gradient are derived as $\exp\left(-\Omega_{d\to\infty}\left(\frac{L}{d\cdot U}\right)\right)$ and $\exp\left(-\frac1d\right)$, respectively. Notably, no prior knowledge of the mathematical properties of the objective function, such as the Lipschitz constant, is given to the algorithm, whereas existing analyses of derivative-free optimization algorithms require it.
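For readers unfamiliar with the algorithm class, the following is a minimal sketch of a (1+1)-ES with success-based step-size adaptation (a 1/5-success-rule variant). The specific adaptation constants and the test function are illustrative assumptions, not the exact setting analyzed in the paper:

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0=1.0, iters=4000, seed=0):
    """Minimal (1+1)-ES sketch: one parent, one offspring per iteration,
    elitist selection, and multiplicative step-size adaptation.
    The adaptation constants below are illustrative choices balanced
    around a 1/5 target success rate."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    sigma, d = sigma0, len(x)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(d)  # sample one offspring
        fy = f(y)
        if fy <= fx:                  # elitist (1+1) selection
            x, fx = y, fy
            sigma *= np.exp(0.2)      # expand step size on success
        else:
            sigma *= np.exp(-0.05)    # shrink on failure (1/5 rule)
    return x, fx

# Illustrative run on the sphere function, a strongly convex
# function with Lipschitz continuous gradient:
x, fx = one_plus_one_es(lambda v: float(v @ v), np.ones(10))
```

Note that the algorithm uses only function-value comparisons, never gradients or Lipschitz constants, which is the setting the bounds above apply to.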
