arXiv Analytics

arXiv:2409.07122 [math.OC]

Two Decentralized Conjugate Gradient Methods with Global Convergence

Liping Wang, Hao Wu, Hongchao Zhang

Published 2024-09-11 (Version 1)

This paper considers the decentralized optimization problem of minimizing a finite sum of continuously differentiable functions over a fixed, connected, undirected network. Motivated by the lack of previously developed decentralized conjugate gradient methods, we propose two such methods, called NDCG and DMBFGS respectively. Firstly, to the best of our knowledge, NDCG is the first decentralized conjugate gradient method shown to have global convergence with constant stepsizes for general nonconvex optimization problems; this result profits from our designed conjugate parameter and relies only on the same mild conditions as centralized conjugate gradient methods. Secondly, we apply the memoryless BFGS technique to develop the DMBFGS method, which requires only vector-vector products to capture the curvature information of the Hessian matrices. Under a proper choice of stepsizes, DMBFGS has global linear convergence for solving strongly convex decentralized optimization problems. Our numerical results show that DMBFGS is very efficient compared with other state-of-the-art methods for decentralized optimization.
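
The abstract notes that the memoryless BFGS direction can be captured with vector-vector products only. The following is a minimal, hypothetical Python sketch of that idea combined with one consensus (mixing-matrix) step; the function names, the mixing matrix W, and the way the quasi-Newton direction is combined with neighbor averaging are illustrative assumptions, not the paper's actual NDCG/DMBFGS updates.

import numpy as np

def memoryless_bfgs_direction(g, s, y, eps=1e-12):
    """Return d = -H g, where H is the memoryless BFGS matrix built from the most
    recent pair (s, y) with scaled identity H0; only inner products and
    scalar-vector operations are used (no matrices are stored)."""
    sy = float(s @ y)
    if sy <= eps:                      # curvature condition fails: fall back to -g
        return -g
    rho = 1.0 / sy
    yy = float(y @ y)
    gamma = sy / yy                    # H0 = gamma * I scaling
    sg = float(s @ g)
    yg = float(y @ g)
    # H = (I - rho*s*y') * gamma*I * (I - rho*y*s') + rho*s*s', applied to g
    Hg = (gamma * g
          - gamma * rho * sg * y
          - gamma * rho * yg * s
          + (gamma * rho**2 * yy * sg + rho * sg) * s)
    return -Hg

def decentralized_step(X, W, grads, S, Y, alpha):
    """One synchronous iteration over n agents (hypothetical combination):
    X      : (n, d) array, row i is agent i's local iterate
    W      : (n, n) doubly stochastic mixing matrix of the network
    grads  : (n, d) local gradients; S, Y: (n, d) most recent local pairs
    alpha  : constant stepsize."""
    X_mix = W @ X                      # consensus: each agent averages its neighbors
    D = np.stack([memoryless_bfgs_direction(grads[i], S[i], Y[i])
                  for i in range(X.shape[0])])
    return X_mix + alpha * D           # local quasi-Newton step after averaging

The point of the sketch is the cost profile: each agent's direction needs only a handful of inner products per iteration, which is what makes a memoryless quasi-Newton update attractive in the decentralized setting.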
