
arXiv:1805.05411 [math.OC]

Accelerated Stochastic Algorithms for Nonconvex Finite-sum and Multi-block Optimization

Guanghui Lan, Yu Yang

Published 2018-05-14 (Version 1)

In this paper, we present new stochastic methods for solving two important classes of nonconvex optimization problems. We first introduce a randomized accelerated proximal gradient (RapGrad) method for solving a class of nonconvex optimization problems consisting of the sum of $m$ component functions, and show that it can significantly reduce the number of gradient computations, especially when the condition number $L/\mu$ (i.e., the ratio between the Lipschitz constant and the negative curvature) is large. More specifically, RapGrad can save up to a factor of ${\cal O}(\sqrt{m})$ in gradient computations compared with existing deterministic nonconvex accelerated gradient methods. Moreover, the number of gradient computations required by RapGrad can be ${\cal O}(m^\frac{1}{6} L^\frac{1}{2} / \mu^\frac{1}{2})$ (at least ${\cal O}(m^\frac{2}{3})$) times smaller than that of the best-known randomized nonconvex gradient methods when $L/\mu \ge m$. Inspired by RapGrad, we also develop a new randomized accelerated proximal dual (RapDual) method for solving a class of multi-block nonconvex optimization problems coupled with linear constraints. We demonstrate that RapDual can also save up to a factor of ${\cal O}(\sqrt{m})$ in projection subproblems compared with its deterministic counterpart, where $m$ denotes the number of blocks. To the best of our knowledge, all the complexity results associated with RapGrad and RapDual are new in the literature. We also illustrate the potential advantages of these algorithms through preliminary numerical experiments.
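To make the two settings concrete, here is a minimal sketch of the problem classes the abstract refers to; the precise finite-sum normalization and the block-structured linearly constrained form below are standard formulations we are assuming, not details given in the abstract itself. For RapGrad, the finite-sum problem reads
\[
\min_{x \in \mathbb{R}^n} \; f(x) := \sum_{i=1}^{m} f_i(x),
\]
where each $\nabla f_i$ is $L$-Lipschitz and each $f_i$ has curvature bounded below by $-\mu$ (i.e., $f_i(\cdot) + \tfrac{\mu}{2}\|\cdot\|^2$ is convex), so that $L/\mu$ plays the role of a condition number. For RapDual, an assumed multi-block form with linear coupling constraints is
\[
\min_{x_1, \dots, x_m} \; \sum_{i=1}^{m} f_i(x_i) \quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b,
\]
where the $m$ blocks $x_i$ are coupled only through the linear constraint.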
