arXiv:2207.06304 [math.OC]
On the Iteration Complexity of Smoothed Proximal ALM for Nonconvex Optimization Problem with Convex Constraints
Jiawei Zhang, Wenqiang Pu, Zhi-Quan Luo
Published 2022-07-13, Version 1
It is well known that the lower bound on the iteration complexity of solving nonconvex unconstrained optimization problems is $\Omega(1/\epsilon^2)$, and that this bound is achieved by the standard gradient descent algorithm when the objective function is smooth. The same lower bound holds for nonconvex constrained problems, but it has remained unknown whether a first-order method can attain it in the constrained setting. In this paper, we show that a simple single-loop first-order algorithm, the smoothed proximal augmented Lagrangian method (ALM), achieves this iteration complexity lower bound. The key technical contribution is a strong local error bound for a general convex constrained problem, which is of independent interest.
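For illustration only (the specifics below are assumptions, not taken from this abstract), a single-loop smoothed proximal ALM iteration can be sketched for the linearly constrained special case $\min_{x\in X} f(x)$ subject to $Ax=b$: with the proximal augmented Lagrangian $K(x,z;y)=f(x)+\langle y,\,Ax-b\rangle+\tfrac{\rho}{2}\|Ax-b\|^2+\tfrac{p}{2}\|x-z\|^2$, each iteration performs a projected primal step $x^{k+1}=\mathrm{proj}_X\big(x^k-\alpha\,\nabla_x K(x^k,z^k;y^k)\big)$, a dual ascent step $y^{k+1}=y^k+\beta\,(Ax^{k+1}-b)$, and a smoothing update $z^{k+1}=z^k+\gamma\,(x^{k+1}-z^k)$. The constraint form and the parameters $\alpha,\beta,\gamma,\rho,p$ here are hypothetical choices meant only to convey the single-loop structure; the paper itself treats general convex constraints.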