arXiv Analytics

arXiv:2106.09663 [math.OC]

A Short Note of PAGE: Optimal Convergence Rates for Nonconvex Optimization

Zhize Li

Published 2021-06-17 (Version 1)

In this note, we first recall the nonconvex problem setting and introduce the optimal PAGE algorithm (Li et al., ICML'21). We then provide a simple and clean convergence analysis showing that PAGE achieves optimal convergence rates. Moreover, PAGE and its analysis can be easily adapted and generalized to other works. We hope that this note provides useful insights and is helpful for future work.
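As a rough illustration of the estimator the note analyzes, here is a minimal NumPy sketch of the PAGE update on a toy finite-sum least-squares problem. The problem data, step size, small-batch size, and switching probability below are illustrative assumptions for this sketch, not values taken from the note; the note itself targets general nonconvex finite-sum/online problems.

```python
import numpy as np

# Minimal sketch of the PAGE estimator (Li et al., ICML'21) on a toy
# finite-sum least-squares problem. The data, step size, and batch
# sizes are illustrative assumptions, not taken from the note.

rng = np.random.default_rng(0)
n, d = 64, 5
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def grad(x, idx):
    """Minibatch gradient of f(x) = (1/n) * sum_i 0.5*(a_i^T x - y_i)^2."""
    return A[idx].T @ (A[idx] @ x - y[idx]) / len(idx)

def page(T=500, bp=4, eta=0.05):
    # PAGE keeps a running estimator g^t of the gradient: with probability
    # p it recomputes the (full-)batch gradient, and otherwise it reuses
    # g^t plus a cheap small-batch correction grad(x^{t+1}) - grad(x^t).
    p = bp / (bp + n)              # switching probability p = b' / (b + b')
    full = np.arange(n)
    x = np.zeros(d)
    g = grad(x, full)
    for _ in range(T):
        x_new = x - eta * g        # plain gradient-descent-style step
        if rng.random() < p:
            g = grad(x_new, full)  # occasional full-batch refresh
        else:
            idx = rng.choice(n, bp, replace=False)
            g = g + grad(x_new, idx) - grad(x, idx)
        x = x_new
    return x

x_hat = page()
print(np.linalg.norm(grad(x_hat, np.arange(n))))
```

The small-batch branch costs only b' gradient evaluations per step, which is what gives PAGE its favorable gradient complexity relative to recomputing a large batch every iteration.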

Related articles:
arXiv:1607.08254 [math.OC] (Published 2016-07-27)
Stochastic Frank-Wolfe Methods for Nonconvex Optimization
arXiv:2108.11713 [math.OC] (Published 2021-08-26)
The Number of Steps Needed for Nonconvex Optimization of a Deep Learning Optimizer is a Rational Function of Batch Size
arXiv:2002.11582 [math.OC] (Published 2020-02-26)
Proximal Gradient Algorithm with Momentum and Flexible Parameter Restart for Nonconvex Optimization