arXiv:2207.06304 [math.OC]

On the Iteration Complexity of Smoothed Proximal ALM for Nonconvex Optimization Problem with Convex Constraints

Jiawei Zhang, Wenqiang Pu, Zhi-Quan Luo

Published 2022-07-13, Version 1

It is well known that the lower bound on the iteration complexity of first-order methods for nonconvex unconstrained optimization problems is $\Omega(1/\epsilon^2)$, and that the standard gradient descent algorithm attains it when the objective function is smooth. The same lower bound holds for nonconvex constrained problems, but whether a first-order method can attain it in that setting has remained open. In this paper, we show that a simple single-loop first-order algorithm, the smoothed proximal augmented Lagrangian method (ALM), achieves this iteration-complexity lower bound. The key technical contribution is a strong local error bound for a general convex constrained problem, which is of independent interest.
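To make the algorithm concrete, the following is a minimal sketch of a single-loop smoothed proximal ALM iteration for the linearly constrained special case $\min_x f(x)$ s.t. $Ax = b$. All step sizes and parameter values here are illustrative assumptions, not the constants analyzed in the paper: each iteration takes one gradient step on the proximal augmented Lagrangian, one dual ascent step, and one slow update of the proximal center (the "smoothing" step that distinguishes this method from vanilla proximal ALM).

```python
import numpy as np

def smoothed_prox_alm(grad_f, A, b, x0, steps=5000,
                      alpha=0.01, rho=10.0, p=1.0,
                      dual_step=1.0, beta=0.5):
    """Illustrative single-loop smoothed proximal ALM sketch for
    min f(x) s.t. Ax = b.  Parameter values are ad hoc assumptions."""
    x = x0.copy()
    z = x0.copy()                      # proximal center ("smoothing" variable)
    y = np.zeros(A.shape[0])           # dual multipliers
    for _ in range(steps):
        # one gradient step on the proximal augmented Lagrangian
        #   K(x, z, y) = f(x) + y^T(Ax-b) + (rho/2)||Ax-b||^2 + (p/2)||x-z||^2
        grad = grad_f(x) + A.T @ (y + rho * (A @ x - b)) + p * (x - z)
        x = x - alpha * grad
        # dual ascent on the constraint residual
        y = y + dual_step * (A @ x - b)
        # slow averaging of the proximal center toward the current iterate
        z = z + beta * (x - z)
    return x, y

# tiny smooth test problem: min 0.5*||x - c||^2  s.t.  sum(x) = 1,
# whose solution is the Euclidean projection of c onto the hyperplane
c = np.array([1.0, 2.0, 3.0])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
x, y = smoothed_prox_alm(lambda x: x - c, A, b, np.zeros(3))
```

On this convex toy instance the iterates approach the projection of `c` onto the constraint hyperplane; the paper's analysis, of course, concerns the harder nonconvex smooth case, where the slow `z` update is what enables the $O(1/\epsilon^2)$ guarantee.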
