arXiv Analytics

arXiv:2005.08304 [math.OC]

From Proximal Point Method to Nesterov's Acceleration

Kwangjun Ahn

Published 2020-05-17, Version 1

The proximal point method (PPM) is a fundamental method in optimization that is often used as a building block for fast optimization algorithms. Building on recent work by Defazio (2019), we provide a complete understanding of Nesterov's accelerated gradient method (AGM) by establishing quantitative and analytical connections between PPM and AGM. The main observation of this paper is that AGM is in fact equal to a simple approximation of PPM, which yields an elementary derivation of AGM's seemingly mysterious updates and step sizes. This connection also leads to a conceptually simple analysis of AGM based on the standard analysis of PPM. The view naturally extends to the strongly convex case and also motivates other accelerated methods for practically relevant settings.
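
To make the two methods named in the abstract concrete, here is a minimal Python sketch (not taken from the paper) that runs an exact proximal point step and Nesterov's accelerated gradient step on an assumed toy quadratic objective; the objective, step sizes, and problem size are illustrative assumptions only.

import numpy as np

# Assumed toy objective for illustration: f(x) = 0.5 * x^T A x - b^T x, with A positive definite.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 10))
A = M.T @ M + 0.1 * np.eye(10)
b = rng.standard_normal(10)
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of the gradient
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)           # exact minimizer, used only to measure progress

def ppm(x0, lam=1.0, iters=50):
    """Proximal point method: x_{k+1} = argmin_x f(x) + ||x - x_k||^2 / (2*lam).
    For this quadratic the prox subproblem has a closed-form solution."""
    x = x0.copy()
    for _ in range(iters):
        x = np.linalg.solve(A + np.eye(len(x)) / lam, b + x / lam)
    return x

def agm(x0, iters=50):
    """Nesterov's accelerated gradient method (standard convex variant)."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                         # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x) # momentum/extrapolation step
        x, t = x_next, t_next
    return x

x0 = np.zeros(10)
print("PPM error:", np.linalg.norm(ppm(x0) - x_star))
print("AGM error:", np.linalg.norm(agm(x0) - x_star))

The sketch only illustrates the standard updates of each method on a problem where the prox step is cheap; the paper's point is the analytical link between the two, not this particular experiment.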

Related articles:
arXiv:1907.13307 [math.OC] (Published 2019-07-31)
Robust stochastic optimization with the proximal point method
arXiv:1512.06081 [math.OC] (Published 2015-12-18)
Proximal Point Method for Vector Optimization on Hadamard Manifolds
arXiv:0809.2594 [math.OC] (Published 2008-09-15, updated 2010-04-13)
Proximal Point Method for a Special Class of Nonconvex Functions on Hadamard Manifolds