arXiv Analytics

arXiv:1602.06661 [math.OC]

Error bounds, quadratic growth, and linear convergence of proximal methods

Dmitriy Drusvyatskiy, Adrian S. Lewis

Published 2016-02-22 (Version 1)

We show that the error bound property, postulating that the step lengths of the proximal gradient method linearly bound the distance to the solution set, is equivalent to a standard quadratic growth condition. We exploit this equivalence in an analysis of asymptotic linear convergence of the proximal gradient algorithm for structured problems lacking strong convexity. More generally, we analyze local linear convergence guarantees of a proximal method for minimizing compositions of convex functions with smooth mappings.
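To illustrate the setting of the abstract, the following sketch (not the authors' code; all names and the LASSO test problem are illustrative assumptions) runs the proximal gradient method on a structured composite objective f(x) + g(x) that is not strongly convex, and records the step lengths that the error bound property relates to the distance from the solution set:

```python
import numpy as np

# Illustrative sketch, not the paper's implementation: proximal gradient for
# min_x 0.5*||Ax - b||^2 + lam*||x||_1. With more columns than rows, the
# smooth term is not strongly convex, yet the method's step lengths
# ||x_{k+1} - x_k|| still shrink, consistent with the error-bound /
# quadratic-growth analysis described in the abstract.

def soft_threshold(v, t):
    """Prox of t*||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient of f
    x = np.zeros(A.shape[1])
    steps = []                          # step lengths ||x_{k+1} - x_k||
    for _ in range(iters):
        grad = A.T @ (A @ x - b)        # gradient of the smooth part
        x_new = soft_threshold(x - grad / L, lam / L)
        steps.append(np.linalg.norm(x_new - x))
        x = x_new
    return x, steps

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))       # underdetermined: no strong convexity
b = rng.standard_normal(20)
x, steps = proximal_gradient(A, b, lam=0.5)
```

Plotting `steps` on a log scale would show the asymptotic geometric decay that, under the quadratic growth condition, translates into linear convergence of the iterates.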

Related articles:
arXiv:1712.06221 [math.OC] (Published 2017-12-18)
Amenable cones: error bounds without constraint qualifications
arXiv:1510.08234 [math.OC] (Published 2015-10-28)
From error bounds to the complexity of first-order descent methods for convex functions
arXiv:2311.01758 [math.OC] (Published 2023-11-03)
On error bounds and optimality conditions at infinity