
arXiv:2101.00935 [math.OC]

First-Order Methods for Convex Optimization

Pavel Dvurechensky, Mathias Staudigl, Shimrit Shtern

Published 2021-01-04 (Version 1)

First-order methods for solving convex optimization problems have been at the forefront of mathematical optimization for the last 20 years. The rapid development of this important class of algorithms is motivated by success stories reported in various applications, most importantly machine learning, signal processing, imaging, and control theory. First-order methods can provide low-accuracy solutions at low computational cost, which makes them an attractive set of tools for large-scale optimization problems. In this survey we cover a number of key developments in gradient-based optimization methods, including non-Euclidean extensions of the classical proximal gradient method and its accelerated versions. Additionally, we survey recent developments in the class of projection-free methods and proximal versions of primal-dual schemes. We give complete proofs for various key results and highlight the unifying aspects of several optimization algorithms.
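As a concrete illustration of the proximal gradient method the abstract refers to, below is a minimal sketch applied to the lasso problem, min_x 0.5*||Ax - b||^2 + lam*||x||_1, where the proximal operator of the l1 term is soft-thresholding. The problem instance, function names, step-size choice, and parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    """Proximal gradient (ISTA-style) sketch for the lasso problem.

    Uses the constant step size 1/L, where L = ||A||_2^2 is the Lipschitz
    constant of the gradient of the smooth term 0.5*||Ax - b||^2.
    """
    L = np.linalg.norm(A, 2) ** 2   # spectral norm squared of A
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)   # proximal (shrinkage) step
    return x

# Usage: recover a sparse signal from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

The accelerated variants surveyed in the paper modify this scheme by taking the gradient step at an extrapolated point, which improves the convergence rate on the smooth part from O(1/k) to O(1/k^2).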

Related articles:
arXiv:1511.02974 [math.OC] (Published 2015-11-10)
New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure
arXiv:1705.06164 [math.OC] (Published 2017-05-17)
A general framework for solving convex optimization problems involving the sum of three convex functions
arXiv:1207.3254 [math.OC] (Published 2012-07-13)
A variable smoothing algorithm for solving convex optimization problems