arXiv:1709.03384 [math.OC]

Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity

Francisco Facchinei, Vyacheslav Kungurtsev, Lorenzo Lampariello, Gesualdo Scutari

Published 2017-09-11, Version 1

We consider, for the first time, general diminishing-stepsize methods for nonconvex, constrained optimization problems. We show that, by using directions obtained in an SQP-like fashion, convergence to generalized stationary points can be proved. We then consider the iteration complexity of this method and of some variants in which the stepsize is either kept constant or decreased according to very simple rules. We establish convergence to $\delta$-approximate stationary points in at most $O(\delta^{-2})$, $O(\delta^{-3})$, or $O(\delta^{-4})$ iterations, depending on the assumptions made on the problem. These results nicely complement the very few existing results in the field.
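For intuition about the diminishing-stepsize idea, here is a minimal sketch of a feasible-direction scheme on a simple box-constrained nonconvex problem. It is not the paper's ghost-penalty SQP method: the projected-gradient direction, the stepsize rule $\gamma_k = \gamma_0/(k+1)$, and all names below (`project_box`, `diminishing_stepsize_method`, the toy objective) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def diminishing_stepsize_method(grad, x0, lo, hi, max_iter=5000, gamma0=1.0):
    """Feasible-direction scheme with the diminishing stepsize
    gamma_k = gamma0 / (k + 1), which satisfies the classical rules
    sum_k gamma_k = inf and sum_k gamma_k^2 < inf.
    Illustrative only: the paper's method builds its direction from an
    SQP-like (ghost-penalty) subproblem, not a projected gradient."""
    x = np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        gamma = gamma0 / (k + 1)
        # Feasible direction: point toward the projection of a gradient step.
        d = project_box(x - grad(x), lo, hi) - x
        # The iterate stays feasible: gamma <= 1 and the box is convex.
        x = x + gamma * d
    return x

# Example: the nonconvex objective f(x) = sum_i (x_i^4 - x_i^2) on [-2, 2]^2,
# whose stationary points are 0 and +/- 1/sqrt(2) in each coordinate.
grad = lambda x: 4 * x**3 - 2 * x
print(diminishing_stepsize_method(grad, x0=[0.9, -0.3], lo=-2.0, hi=2.0))
```

Because $\gamma_k \le 1$, each iterate is a convex combination of the current point and a projected gradient step, so feasibility is preserved without a line search; this is the structural feature that diminishing-stepsize analyses exploit in place of sufficient-decrease conditions.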

Related articles:
arXiv:2007.12219 [math.OC] (Published 2020-07-23)
A First-Order Primal-Dual Method for Nonconvex Constrained Optimization Based On the Augmented Lagrangian
arXiv:0709.1020 [math.OC] (Published 2007-09-07)
Evolution Strategies in Optimization Problems
arXiv:2105.08317 [math.OC] (Published 2021-05-18, updated 2022-04-19)
An Augmented Lagrangian Method for Optimization Problems with Structured Geometric Constraints