{ "id": "2209.12467", "version": "v1", "published": "2022-09-26T07:16:50.000Z", "updated": "2022-09-26T07:16:50.000Z", "title": "Convergence rate of the (1+1)-evolution strategy on locally strongly convex functions with lipschitz continuous gradient and their monotonic transformations", "authors": [ "Daiki Morinaga", "Kazuto Fukuchi", "Jun Sakuma", "Youhei Akimoto" ], "comment": "15 pages", "categories": [ "math.OC", "cs.NE" ], "abstract": "Evolution strategy (ES) is one of promising classes of algorithms for black-box continuous optimization. Despite its broad successes in applications, theoretical analysis on the speed of its convergence is limited on convex quadratic functions and their monotonic transformation.%theoretically how fast it converges to a optima on convex functions is still vague. In this study, an upper bound and a lower bound of the rate of linear convergence of the (1+1)-ES on locally $L$-strongly convex functions with $U$-Lipschitz continuous gradient are derived as $\\exp\\left(-\\Omega_{d\\to\\infty}\\left(\\frac{L}{d\\cdot U}\\right)\\right)$ and $\\exp\\left(-\\frac1d\\right)$, respectively. Notably, any prior knowledge on the mathematical properties of the objective function such as Lipschitz constant is not given to the algorithm, whereas the existing analyses of derivative-free optimization algorithms require them.", "revisions": [ { "version": "v1", "updated": "2022-09-26T07:16:50.000Z" } ], "analyses": { "subjects": [ "65K05", "90C25", "90C26", "90C56", "90C59", "G.1.6" ], "keywords": [ "lipschitz continuous gradient", "locally strongly convex functions", "monotonic transformations", "convergence rate", "derivative-free optimization algorithms" ], "note": { "typesetting": "TeX", "pages": 15, "language": "en", "license": "arXiv", "status": "editable" } } }