arXiv Analytics

arXiv:2304.04032 [math.OC]

A Riemannian Proximal Newton Method

Wutao Si, P.-A. Absil, Wen Huang, Rujun Jiang, Simon Vary

Published 2023-04-08 (Version 1)

In recent years, the proximal gradient method and its variants have been generalized to Riemannian manifolds for solving optimization problems with an additively separable structure, i.e., $f + h$, where $f$ is continuously differentiable and $h$ may be nonsmooth but convex, with a computationally reasonable proximal mapping. In this paper, we generalize the proximal Newton method to embedded submanifolds for solving problems of this type with $h(x) = \mu \|x\|_1$. The generalization relies on the Weingarten map and on semismooth analysis. It is shown that the Riemannian proximal Newton method has a local superlinear convergence rate under certain reasonable assumptions. Moreover, a hybrid version is given by concatenating a Riemannian proximal gradient method and the Riemannian proximal Newton method. It is shown that if the objective function satisfies the Riemannian KL property and the switch parameter is chosen appropriately, then the hybrid method converges globally and also has a local superlinear convergence rate. Numerical experiments on random and synthetic data demonstrate the performance of the proposed methods.
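For concreteness, the proximal mapping of the nonsmooth term $h(x) = \mu \|x\|_1$ considered in the abstract is the elementwise soft-thresholding operator, which is what makes this choice of $h$ "computationally reasonable". A minimal NumPy sketch (the function name and interface are illustrative, not from the paper; the paper's methods apply this on a manifold, which is not modeled here):

```python
import numpy as np

def prox_l1(x, mu):
    """Proximal mapping of h(x) = mu * ||x||_1, i.e.
    argmin_y  mu * ||y||_1 + 0.5 * ||y - x||^2,
    computed elementwise by soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

# Entries with magnitude below mu are set to zero;
# larger entries are shrunk toward zero by mu.
x = np.array([3.0, -0.5, 1.0])
print(prox_l1(x, 1.0))  # [2. -0.  0.]
```

This closed-form, separable update is the building block that Riemannian proximal gradient and proximal Newton methods evaluate at each iteration (combined, in the Riemannian setting, with retractions and curvature information via the Weingarten map).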

Related articles:
arXiv:1702.00709 [math.OC] (Published 2017-02-02)
IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate
arXiv:1909.06065 [math.OC] (Published 2019-09-13)
Riemannian Proximal Gradient Methods
arXiv:1906.00506 [math.OC] (Published 2019-06-03)
DAve-QN: A Distributed Averaged Quasi-Newton Method with Local Superlinear Convergence Rate