arXiv:2409.07122 [math.OC]
Two Decentralized Conjugate Gradient Methods with Global Convergence
Liping Wang, Hao Wu, Hongchao Zhang
Published 2024-09-11 (Version 1)
This paper considers the decentralized optimization problem of minimizing a finite sum of continuously differentiable functions over a fixed, connected, undirected network. Motivated by the lack of previously developed decentralized conjugate gradient methods, we propose two such methods, called NDCG and DMBFGS respectively. Firstly, to the best of our knowledge, NDCG is the first decentralized conjugate gradient method shown to be globally convergent with constant stepsizes for general nonconvex optimization problems; this property stems from our newly designed conjugate parameter and relies only on the same mild conditions as in the centralized conjugate gradient method. Secondly, we apply the memoryless BFGS technique to develop the DMBFGS method, which requires only vector-vector products to capture the curvature information of the Hessian matrices. Under a proper choice of stepsizes, DMBFGS achieves global linear convergence for solving strongly convex decentralized optimization problems. Our numerical results show that DMBFGS is very efficient compared with other state-of-the-art methods for decentralized optimization.
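To illustrate the kind of update the abstract describes, below is a minimal, hypothetical sketch of a decentralized quasi-Newton round: each agent mixes its iterate with its neighbors through a doubly stochastic matrix `W` and then moves along a standard memoryless BFGS direction computed from inner products and vector additions only, so no Hessian approximation is ever stored. This is a generic template under those assumptions, not the authors' exact NDCG or DMBFGS updates (in particular, NDCG's specially designed conjugate parameter is not reproduced here); all names and parameters are illustrative.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y, eps=1e-12):
    """Direction d = -H g for the standard memoryless BFGS matrix
    H = (I - rho*s*y^T)(I - rho*y*s^T) + rho*s*s^T,  rho = 1/(y^T s).
    Expanding H g shows only vector-vector products are needed."""
    sy = float(y @ s)
    if sy <= eps:          # curvature condition fails:
        return -g          # fall back to steepest descent
    rho = 1.0 / sy
    a, b = float(y @ g), float(s @ g)
    Hg = g - rho * b * y + (rho * b * (1.0 + rho * float(y @ y)) - rho * a) * s
    return -Hg

def decentralized_step(X, G, X_prev, G_prev, W, alpha):
    """One synchronous round over the network. Row i of X is agent i's
    iterate and row i of G its local gradient; W is a doubly stochastic
    mixing matrix supported on the network edges; alpha is a stepsize."""
    X_mix = W @ X                      # neighbor averaging (consensus)
    D = np.stack([
        memoryless_bfgs_direction(G[i], X[i] - X_prev[i], G[i] - G_prev[i])
        for i in range(X.shape[0])
    ])
    return X_mix + alpha * D           # local quasi-Newton move
```

Because the direction is assembled from a handful of dot products and scaled vector sums, each agent's per-iteration cost and memory stay linear in the problem dimension, which is the practical point of the memoryless BFGS technique highlighted in the abstract.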