arXiv Analytics

arXiv:2209.12463 [math.OC]

On the Complexity of Deterministic Nonsmooth and Nonconvex Optimization

Michael I. Jordan, Tianyi Lin, Manolis Zampetakis

Published 2022-09-26 (Version 1)

In this paper, we present several new results on minimizing a nonsmooth and nonconvex function under a Lipschitz condition. Recent work suggests that while the classical notion of Clarke stationarity is computationally intractable to achieve up to a sufficiently small constant tolerance, randomized first-order algorithms find a $(\delta, \epsilon)$-Goldstein stationary point with a complexity bound of $O(\delta^{-1}\epsilon^{-3})$, which is independent of the problem dimension~\citep{Zhang-2020-Complexity, Davis-2021-Gradient, Tian-2022-Finite}. However, deterministic algorithms have not been fully explored, leaving open several basic problems in nonconvex and nonsmooth optimization. Our first contribution is to demonstrate that randomization is essential for obtaining a dimension-free complexity guarantee: we prove a lower bound of $\Omega(\sqrt{d})$ for all deterministic algorithms that have access to both first-order and zeroth-order oracles. Further, we show that a zeroth-order oracle is essential for obtaining a finite-time convergence guarantee, by proving that deterministic algorithms with access to only a first-order oracle cannot find an approximate Goldstein stationary point within a finite number of iterations, up to some small constant tolerance. Finally, we propose a deterministic smoothing approach that induces a smoothness parameter that is exponential in a parameter $M > 0$, and we design a new deterministic algorithm with a dimension-free complexity bound of $\tilde{O}(M\delta^{-1}\epsilon^{-3})$.
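
For reference, the notion of Goldstein stationarity used above admits the following standard formulation from the nonsmooth optimization literature (stated here for context, not quoted from this abstract). For a Lipschitz function $f$, the Goldstein $\delta$-subdifferential at a point $x$ is the convex hull of Clarke subdifferentials taken over a $\delta$-ball around $x$,
$$\partial_\delta f(x) := \mathrm{conv}\Bigl(\,\bigcup_{y \in \mathbb{B}_\delta(x)} \partial f(y)\Bigr),$$
and $x$ is called a $(\delta, \epsilon)$-Goldstein stationary point if
$$\min_{g \in \partial_\delta f(x)} \|g\| \le \epsilon.$$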

Related articles:
arXiv:2002.11582 [math.OC] (Published 2020-02-26)
Proximal Gradient Algorithm with Momentum and Flexible Parameter Restart for Nonconvex Optimization
arXiv:2302.10065 [math.OC] (Published 2023-02-20)
Yet another fast variant of Newton's method for nonconvex optimization
arXiv:1607.08254 [math.OC] (Published 2016-07-27)
Stochastic Frank-Wolfe Methods for Nonconvex Optimization