arXiv:2211.10942 [math.OC]
On the convergence analysis of DCA
Published 2022-11-20 (Version 1)
In this paper, we propose a clean and general proof framework for establishing the convergence analysis of the Difference-of-Convex (DC) programming algorithm (DCA), covering both the standard DC program and the convex constrained DC program. We first discuss suitable assumptions ensuring the well-definedness of DCA. Then, we focus on the convergence analysis of DCA, in particular the global convergence of the sequence $\{x^k\}$ generated by DCA under the Łojasiewicz subgradient inequality and the Kurdyka-Łojasiewicz property, respectively. Moreover, the convergence rates of the sequences $\{f(x^k)\}$ and $\{\|x^k-x^*\|\}$ are also investigated. We hope that the proof framework presented in this article will be a useful tool for conveniently establishing the convergence analysis of many variants of DCA and of new DCA-type algorithms.
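For context, DCA is usually stated for a DC decomposition of the objective; the sketch below recalls the standard (textbook) form of the DC program and of the basic DCA iteration. The decomposition $f = g - h$ and the symbols $g$, $h$, $y^k$ are assumed here for illustration, since the abstract itself does not fix the notation.
\[
  \min_{x \in \mathbb{R}^n} \; f(x) = g(x) - h(x), \qquad g,\ h \ \text{convex (possibly extended-valued)},
\]
\[
  y^k \in \partial h(x^k), \qquad
  x^{k+1} \in \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \bigl\{\, g(x) - \langle y^k, x \rangle \,\bigr\}, \qquad k = 0, 1, 2, \dots
\]
Under this scheme, each iteration linearizes the concave part $-h$ at $x^k$ and solves the resulting convex subproblem, which is the setting in which the convergence results described above are typically stated.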