arXiv Analytics

arXiv:1611.08022 [math.OC]

A Stronger Convergence Result on the Proximal Incremental Aggregated Gradient Method

Nuri Denizcan Vanli, Mert Gurbuzbalaban, Asu Ozdaglar

Published 2016-11-23 (Version 1)

We study the convergence rate of the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of a large number of smooth component functions (where the sum is strongly convex) and a non-smooth convex function. At each iteration, the PIAG method moves along an aggregated gradient, formed by incrementally updating gradients of the component functions, each of which has been evaluated at least once within the last $K$ iterations, and then takes a proximal step with respect to the non-smooth function. We show that the PIAG algorithm attains an iteration complexity that grows linearly with the condition number of the problem and the delay parameter $K$. This improves upon the best previously known global linear convergence rate for the PIAG algorithm, which has a quadratic dependence on $K$.
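To make the iteration concrete, here is a minimal sketch of the method described in the abstract, under assumed simplifications: components are refreshed in a fixed cyclic order (so the delay is $K = n$), the non-smooth term is $\lambda\|x\|_1$ (its prox is soft-thresholding), and the step size, function names, and the toy problem are all illustrative choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag(component_grads, prox, x0, step, n_iters):
    """Sketch of the proximal incremental aggregated gradient method.

    component_grads : list of gradient functions of the smooth f_i
    prox            : prox(v, step) of the non-smooth convex term
    Components are refreshed cyclically, so every stored gradient is
    at most K = len(component_grads) iterations out of date.
    """
    x = np.asarray(x0, dtype=float).copy()
    n = len(component_grads)
    table = [g(x) for g in component_grads]   # stored (possibly stale) gradients
    agg = np.sum(table, axis=0)               # aggregated gradient
    for k in range(n_iters):
        i = k % n                             # component refreshed this iteration
        fresh = component_grads[i](x)
        agg += fresh - table[i]               # incremental update of the aggregate
        table[i] = fresh
        x = prox(x - step * agg, step)        # proximal step
    return x

# Toy problem: minimize sum_i 0.5*(x - a_i)^2 + lam*||x||_1.
a = [1.0, 2.0, 3.0, -1.0]
lam = 1.0
grads = [lambda x, ai=ai: x - ai for ai in a]
prox = lambda v, s: soft_threshold(v, s * lam)
x = piag(grads, prox, x0=np.zeros(1), step=1.0 / (2 * len(a)), n_iters=500)
# closed-form minimizer: soft_threshold(mean(a), lam/n) = 1.0
```

Only one component gradient is recomputed per iteration, so the per-iteration cost is independent of $n$; the delay $K$ enters through the staleness of the other $n-1$ stored gradients, which is exactly the quantity the abstract's complexity bound depends on.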

Related articles: Most relevant | Search more
arXiv:1702.08166 [math.OC] (Published 2017-02-27)
Linear Convergence of the Proximal Incremental Aggregated Gradient Method under Quadratic Growth Condition
arXiv:2012.12250 [math.OC] (Published 2020-12-22)
Iteratively Reweighted Least Squares for $\ell_1$-minimization with Global Linear Convergence Rate
arXiv:1712.00984 [math.OC] (Published 2017-12-04)
Inertial Proximal Incremental Aggregated Gradient Method