arXiv Analytics

arXiv:2108.02129 [math.OC]

Convergence results of a nested decentralized gradient method for non-strongly convex problems

Woocheol Choi, Doheon Kim, Seok-Bae Yun

Published 2021-08-04, Version 1

We are concerned with the convergence of the NEAR-DGD$^+$ (Nested Exact Alternating Recursion Distributed Gradient Descent) method, introduced to solve distributed optimization problems. Under the assumptions of strong convexity and Lipschitz continuous gradient, linear convergence was established in \cite{BBKW - Near DGD}. In this paper, we investigate the convergence properties of NEAR-DGD$^+$ in the absence of strong convexity. More precisely, we establish convergence results in the cases where only convexity or quasi-strong convexity is assumed on the objective function in place of strong convexity. Numerical results are provided to support the convergence results.
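To make the setting concrete, below is a minimal sketch of a NEAR-DGD-type iteration, not the authors' code. It assumes n agents each holding a local convex objective f_i, a symmetric doubly stochastic mixing matrix W on a connected network, and models the "+" variant by performing k consensus (communication) rounds at iteration k; the function and variable names (`near_dgd_plus`, `grad_fns`, the toy quadratic objectives) are illustrative choices, not taken from the paper.

```python
import numpy as np

def near_dgd_plus(grad_fns, W, x0, step, iters):
    """Sketch of a nested decentralized gradient iteration.

    grad_fns[i](x) returns the gradient of agent i's local objective at x.
    W is a doubly stochastic mixing matrix; x0 has shape (n, d), one row per agent.
    """
    n = len(grad_fns)
    x = np.copy(x0)
    for k in range(1, iters + 1):
        # Computation step: each agent takes a local gradient step on its own f_i.
        y = x - step * np.vstack([grad_fns[i](x[i]) for i in range(n)])
        # Communication step: k rounds of mixing with neighbours (increasing with k,
        # as in the "+" variant assumed here).
        for _ in range(k):
            y = W @ y
        x = y
    return x

if __name__ == "__main__":
    # Toy usage: 4 agents with quadratics f_i(x) = 0.5*||x - b_i||^2 on a ring graph,
    # whose common minimizer is the average of the b_i.
    n, d = 4, 2
    rng = np.random.default_rng(0)
    b = rng.normal(size=(n, d))
    grads = [lambda x, bi=b[i]: x - bi for i in range(n)]
    W = np.array([[0.5, 0.25, 0.0, 0.25],
                  [0.25, 0.5, 0.25, 0.0],
                  [0.0, 0.25, 0.5, 0.25],
                  [0.25, 0.0, 0.25, 0.5]])  # doubly stochastic ring weights
    x = near_dgd_plus(grads, W, np.zeros((n, d)), step=0.5, iters=30)
    print("agent estimates:\n", x)
    print("consensus optimum:", b.mean(axis=0))
```

In this toy run all agent rows approach the network-wide minimizer (the mean of the b_i); the paper's contribution is establishing such convergence without the strong convexity assumed in the original analysis.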

Related articles:
arXiv:2006.09097 [math.OC] (Published 2020-06-16)
Accelerated Alternating Minimization and Adaptability to Strong Convexity
arXiv:2312.03583 [math.OC] (Published 2023-12-06)
Strong Convexity of Sets in Riemannian Manifolds
arXiv:1504.03087 [math.OC] (Published 2015-04-13)
Iteration Complexity Analysis of Multi-Block ADMM for a Family of Convex Minimization without Strong Convexity