arXiv:1711.09959 [math.OC]

On the convergence rate of the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm

Samara C. Lima, Maicon Alves

Published 2017-11-27 (Version 1)

Relying on fixed point techniques, Mahey, Oualibouch and Tao introduced the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm and analyzed its performance on inclusions for strongly monotone and Lipschitz continuous operators. The SPDG algorithm generalizes Spingarn's partial inverse method by allowing scaling factors, a key strategy for speeding up the convergence of numerical algorithms. In this note, we show that the SPDG algorithm can alternatively be analyzed within Spingarn's original partial inverse framework, which traces back to Spingarn's 1983 paper. We simply show that, under the assumptions considered by Mahey, Oualibouch and Tao, Spingarn's partial inverse of the underlying maximal monotone operator is strongly monotone, which allows one to employ recent results on the convergence and iteration-complexity of proximal point type methods for strongly monotone operators. In doing so, we additionally obtain a potentially faster convergence rate for the SPDG algorithm and a more accurate upper bound on the number of iterations needed to achieve prescribed tolerances, especially on ill-conditioned problems.
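
For readers unfamiliar with the two ingredients above, the following is a brief sketch of the standard definitions involved; they come from Spingarn's 1983 paper and the classical proximal point literature, not from the abstract itself. Given a maximal monotone operator $T$ on a Hilbert space $H$ and a closed subspace $V \subseteq H$, the partial inverse $T_V$ is the maximal monotone operator defined by

$$y \in T_V(x) \quad\Longleftrightarrow\quad P_V\,y + P_{V^\perp}\,x \in T\!\left(P_V\,x + P_{V^\perp}\,y\right),$$

where $P_V$ and $P_{V^\perp}$ are the orthogonal projections onto $V$ and its orthogonal complement $V^\perp$. If an operator $S$ is maximal monotone and $\mu$-strongly monotone, its resolvent $J_{\lambda S} = (I + \lambda S)^{-1}$ is a contraction with modulus $(1 + \lambda\mu)^{-1}$, so the proximal point iteration

$$z^{k+1} = (I + \lambda S)^{-1} z^k$$

satisfies $\|z^{k+1} - z^*\| \le (1 + \lambda\mu)^{-1}\,\|z^k - z^*\|$, where $z^*$ is the unique zero of $S$. This is the linear rate that becomes available once the partial inverse is shown to be strongly monotone.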
