arXiv Analytics

arXiv:1802.03347 [math.OC]

Acceleration and global convergence of a first-order primal-dual method for nonconvex problems

Christian Clason, Stanislav Mazurenko, Tuomo Valkonen

Published 2018-02-09, Version 1

First-order primal-dual algorithms are the backbone of mathematical image processing and, more generally, of inverse problems that can be formulated as convex optimization problems. Recent advances have extended their applicability to areas previously dominated by second-order algorithms, such as nonconvex problems arising in optimal control. Nonetheless, the application of first-order primal-dual algorithms to nonconvex large-scale optimization still requires further investigation. In this paper, we analyze an extension of the primal-dual hybrid gradient method (PDHGM, also known as the Chambolle-Pock method) designed to solve problems with a nonlinear operator in the saddle term. Based on the idea of testing, we derive new step-length parameter conditions for convergence in infinite-dimensional Hilbert spaces and provide acceleration rules for suitably locally monotone problems. Importantly, we prove linear convergence rates, as well as global convergence in certain cases. We demonstrate the efficacy of the new step-length rules on PDE-constrained optimization problems.
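To fix ideas, the following is a minimal sketch of the basic (convex, linear-operator) PDHGM iteration that the paper extends, applied to 1D total-variation denoising. The problem instance, parameter values, and function names here are illustrative assumptions, not taken from the paper; in particular, the paper's contribution concerns a *nonlinear* operator K(x) in the saddle term, whereas below K = D is a linear difference operator.

```python
import numpy as np

def pdhg_tv_denoise(b, alpha=0.1, iters=500):
    """Illustrative PDHGM (Chambolle-Pock) sketch for
        min_x 0.5*||x - b||^2 + alpha*||D x||_1,
    written as the saddle-point problem
        min_x max_y <D x, y> + 0.5*||x - b||^2 - delta_{[-alpha,alpha]}(y).
    All parameter choices are hypothetical defaults for this demo."""
    D = np.diff                                        # forward differences
    # Adjoint of D: <D x, y> = <x, D^T y>
    Dt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))
    L = 2.0                                            # ||D|| <= 2
    tau = sigma = 0.9 / L                              # tau*sigma*||D||^2 < 1
    theta = 1.0                                        # over-relaxation parameter
    x = b.copy()
    xbar = x.copy()
    y = np.zeros(b.size - 1)
    for _ in range(iters):
        # Dual step: gradient ascent + projection onto [-alpha, alpha]
        # (the proximal map of the conjugate of alpha*||.||_1)
        y = np.clip(y + sigma * D(xbar), -alpha, alpha)
        # Primal step: proximal map of 0.5*||. - b||^2 in closed form
        x_new = (x - tau * Dt(y) + tau * b) / (1.0 + tau)
        # Extrapolation of the primal iterate
        xbar = x_new + theta * (x_new - x)
        x = x_new
    return x
```

In this convex setting, convergence requires tau*sigma*||D||^2 < 1; the paper's step-length conditions generalize this kind of requirement to a nonlinear saddle term and supply acceleration rules (varying tau, sigma, theta) under local monotonicity.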

Related articles:
arXiv:2304.09537 [math.OC] (Published 2023-04-19)
Global Convergence of Algorithms Based on Unions of Nonexpansive Maps
arXiv:2202.02914 [math.OC] (Published 2022-02-07)
Global convergence and optimality of the heavy ball method for non-convex optimization
arXiv:1805.12591 [math.OC] (Published 2018-05-31)
On Acceleration with Noise-Corrupted Gradients