arXiv:2411.07661 [math.OC]

A preconditioned second-order convex splitting algorithm with a difference of varying convex functions and line search

Xinhua Shen, Zaijiu Shang, Hongpeng Sun

Published 2024-11-12 (Version 1)

This paper introduces a preconditioned convex splitting algorithm with line search for nonconvex optimization problems. The algorithm applies second-order backward differentiation formulas (BDF2) to the implicit, linear components of the gradient flow of the variational functional and the Adams-Bashforth scheme to its explicit, nonlinear parts. The resulting method resembles a generalized difference-of-convex-functions approach in which the pair of convex functions varies from iteration to iteration, and it integrates the Armijo line search strategy to improve performance. Classical preconditioners such as symmetric Gauss-Seidel, Jacobi, and Richardson are also discussed in this setting. Global convergence is established through the Kurdyka-Łojasiewicz property and holds even when each implicit step is solved with only a finite number of preconditioned iterations. Numerical experiments demonstrate the superiority of the proposed second-order convex splitting with line search over conventional difference-of-convex-functions algorithms.
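As a concrete illustration of the kind of scheme the abstract describes, the following is a minimal Python sketch of a second-order convex splitting step: BDF2 for the implicit, linear part, Adams-Bashforth extrapolation for the explicit, nonlinear part, and an Armijo line search on the resulting direction. The one-dimensional Allen-Cahn model, the stabilization constant S, the step size tau, and the direct linear solve (standing in for a fixed number of symmetric Gauss-Seidel, Jacobi, or Richardson sweeps) are all assumptions made for illustration, not the authors' implementation.

import numpy as np

# Illustrative model (an assumption, not from the paper): discrete Allen-Cahn
# energy E(u) = 0.5*u^T A u + sum 0.25*(u^2 - 1)^2, whose gradient flow is
# u_t = -A u - (u^3 - u).
n = 64
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2   # 1D Dirichlet Laplacian

def energy(u):
    return 0.5 * u @ (A @ u) + 0.25 * np.sum((u**2 - 1.0)**2)

def grad(u):
    return A @ u + u**3 - u

tau = 0.1   # pseudo-time step of the gradient flow (illustrative value)
S = 2.0     # linear stabilization so the implicit part dominates the splitting
M = (3.0 / (2.0 * tau) + S) * np.eye(n) + A   # BDF2 implicit operator

rng = np.random.default_rng(0)
u_prev = 0.1 * rng.standard_normal(n)
u = u_prev.copy()   # with u_prev = u, the first BDF2 step reduces to first order

for it in range(200):
    # Adams-Bashforth extrapolation of the explicit, nonlinear part.
    u_star = 2.0 * u - u_prev
    rhs = (4.0 * u - u_prev) / (2.0 * tau) + S * u_star - (u_star**3 - u_star)
    # The paper solves this step with finitely many preconditioned iterations
    # (symmetric Gauss-Seidel, Jacobi, or Richardson); a direct solve is used
    # here only to keep the sketch short.
    u_bdf = np.linalg.solve(M, rhs)

    # Armijo line search along the splitting direction to enforce energy descent.
    d = u_bdf - u
    t, c = 1.0, 1e-4
    while energy(u + t * d) > energy(u) + c * t * (grad(u) @ d) and t > 1e-8:
        t *= 0.5
    u_prev, u = u, u + t * d

print("final energy:", energy(u))

Note that the line search only ever shortens the BDF2 step, so the update retains its second-order character whenever the full step already decreases the energy.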

Related articles:
arXiv:2205.06860 [math.OC] (Published 2022-05-13)
Four Operator Splitting via a Forward-Backward-Half-Forward Algorithm with Line Search
arXiv:1603.06772 [math.OC] (Published 2016-03-22)
Line Search for Averaged Operator Iteration
arXiv:2405.06824 [math.OC] (Published 2024-05-10)
A Quasi-Newton Primal-Dual Algorithm with Line Search