arXiv:2209.12463 [math.OC]
On the Complexity of Deterministic Nonsmooth and Nonconvex Optimization
Michael I. Jordan, Tianyi Lin, Manolis Zampetakis
Published 2022-09-26, Version 1
In this paper, we present several new results on minimizing a nonsmooth and nonconvex function under a Lipschitz condition. Recent work suggests that while the classical notion of Clarke stationarity is computationally intractable up to a sufficiently small constant tolerance, randomized first-order algorithms find a $(\delta, \epsilon)$-Goldstein stationary point with a complexity bound of $O(\delta^{-1}\epsilon^{-3})$, which is independent of the problem dimension~\citep{Zhang-2020-Complexity, Davis-2021-Gradient, Tian-2022-Finite}. However, deterministic algorithms have not been fully explored, leaving open several basic problems in nonconvex and nonsmooth optimization. Our first contribution is to demonstrate that randomization is essential for obtaining a dimension-free complexity guarantee: we prove a lower bound of $\Omega(\sqrt{d})$ for all deterministic algorithms that have access to both first- and zeroth-order oracles. Further, we show that a zeroth-order oracle is essential for obtaining a finite-time convergence guarantee, by proving that deterministic algorithms with only a first-order oracle cannot find an approximate Goldstein stationary point within a finite number of iterations up to some small constant tolerance. Finally, we propose a deterministic smoothing approach whose induced smoothness parameter is exponential in a parameter $M > 0$, and design a new deterministic algorithm with a dimension-free complexity bound of $\tilde{O}(M\delta^{-1}\epsilon^{-3})$.
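To give intuition for the randomized first-order approach the abstract contrasts against, the following is a minimal sketch (not the paper's algorithm) of a gradient-sampling step: subgradients are averaged over random points in a $\delta$-ball around the iterate, approximating an element of the Goldstein $\delta$-subdifferential, and a normalized step of length $\delta$ is taken. The test function `f` and all parameter values are illustrative assumptions.

```python
import numpy as np

def f(x):
    # hypothetical nonsmooth, Lipschitz test function with minimizer (0, 1)
    return abs(x[0]) + 0.5 * abs(x[1] - 1.0)

def grad_f(x):
    # a subgradient of f, valid almost everywhere (at kinks we pick sign(0) = 0)
    return np.array([np.sign(x[0]), 0.5 * np.sign(x[1] - 1.0)])

def gradient_sampling_step(x, delta=0.1, n_samples=20, rng=None):
    """Average subgradients at random points in the delta-ball around x
    (a crude estimate of a Goldstein delta-subdifferential element),
    then take a normalized descent step of length delta."""
    if rng is None:
        rng = np.random.default_rng(0)
    pts = x + delta * rng.uniform(-1.0, 1.0, size=(n_samples, x.size))
    g = np.mean([grad_f(p) for p in pts], axis=0)
    norm = np.linalg.norm(g)
    if norm < 1e-12:  # approximately (delta, eps)-stationary, up to sampling error
        return x, g
    return x - delta * g / norm, g

rng = np.random.default_rng(0)
x = np.array([2.0, -1.0])
for _ in range(100):
    x, g = gradient_sampling_step(x, rng=rng)
print(x)  # hovers within O(delta) of the minimizer (0, 1)
```

The fixed step length $\delta$ means the iterates do not converge to the minimizer but oscillate in its $O(\delta)$-neighborhood; the randomness of the sampled ball is exactly what the paper's lower bound shows cannot be removed without paying a $\Omega(\sqrt{d})$ dimension dependence.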