arXiv:2406.09786 [math.OC]

Convergence analysis of a regularized Newton method with generalized regularization terms for convex optimization problems

Yuya Yamakawa, Nobuo Yamashita

Published 2024-06-14 (Version 1)

In this paper, we present a regularized Newton method (RNM) with generalized regularization terms for unconstrained convex optimization problems. The generalized regularization includes quadratic, cubic, and elastic net regularization as special cases. The proposed method is therefore a general framework that covers not only the classical and cubic RNMs but also a novel RNM with elastic net regularization. We show that the proposed RNM achieves global $\mathcal{O}(k^{-2})$ convergence and local superlinear convergence, matching the rates of the cubic RNM.
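For intuition, a classical RNM step with the quadratic regularization term (one special case of the framework above) solves $(\nabla^2 f(x_k) + \mu I)\, d_k = -\nabla f(x_k)$ and sets $x_{k+1} = x_k + d_k$. The following sketch is an illustration only, not the paper's algorithm: the function names, the fixed regularization parameter `mu`, and the stopping rule are assumptions, whereas the paper's framework allows generalized (e.g. cubic or elastic net) regularization terms and adaptive parameters.

```python
import numpy as np

def regularized_newton(grad, hess, x0, mu=1e-2, tol=1e-8, max_iter=100):
    """Minimal regularized Newton iteration with a quadratic
    regularization term: solve (H_k + mu * I) d = -g_k at each step.

    This is the classical RNM; the paper's generalized framework
    also covers cubic and elastic-net regularization terms.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # first-order stationarity test
            break
        H = hess(x)
        # Regularization keeps the linear system well-conditioned
        # even when H is singular or nearly so.
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        x = x + d
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = regularized_newton(lambda x: A @ x - b, lambda x: A, x0=np.zeros(2))
```

For a fixed small `mu`, each step contracts the error by roughly a factor `mu / lambda_min(A)`, so the iterates converge rapidly on strongly convex problems; the paper's analysis concerns the sharper global and local rates under generalized regularization.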

Related articles:
arXiv:1710.07367 [math.OC] (Published 2017-10-19)
Convergence Analysis of the Frank-Wolfe Algorithm and Its Generalization in Banach Spaces
arXiv:1508.03899 [math.OC] (Published 2015-08-17)
Convergence Analysis of Algorithms for DC Programming
arXiv:1702.05142 [math.OC] (Published 2017-02-16)
Exact Diffusion for Distributed Optimization and Learning --- Part II: Convergence Analysis