arXiv Analytics

arXiv:2009.10159 [math.OC]

Operator-valued formulas for Riemannian Gradient and Hessian and families of tractable metrics in optimization and machine learning

Du Nguyen

Published 2020-09-21 (Version 1)

We provide an explicit formula for the Levi-Civita connection and the Riemannian Hessian when the tangent space at each point of a Riemannian manifold is embedded in an inner-product space with a non-constant metric. Together with a classical projection formula, this allows us to evaluate the Riemannian gradient and Hessian for several families of metrics extending existing ones on classical manifolds: a family of metrics on Stiefel manifolds connecting both the constant and canonical ambient metrics, with closed-form geodesics; a family of quotient metrics on a manifold of positive-semidefinite matrices of fixed rank, considered as a quotient of a product of a Stiefel manifold and a positive-definite matrix manifold with affine-invariant metrics; and a large family of new metrics on flag manifolds. We show that, in many instances, this method allows us to apply symbolic calculus to derive formulas for the Riemannian gradient and Hessian. The method greatly extends the list of potential metrics that could be used in manifold optimization and machine learning.
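The classical projection formula the abstract refers to can be illustrated in the simplest setting the paper generalizes: the Stiefel manifold with the constant (Euclidean) ambient metric, where the Riemannian gradient is the orthogonal projection of the Euclidean gradient onto the tangent space. The sketch below is an assumption-laden illustration of that baseline case, not the paper's operator-valued formulas or its extended metric families; the objective f(X) = trace(X^T A X) is a hypothetical example.

```python
import numpy as np

def stiefel_proj(X, Z):
    # Orthogonal projection onto the tangent space of the Stiefel
    # manifold {X : X^T X = I} at X, under the constant Euclidean
    # ambient metric: P_X(Z) = Z - X sym(X^T Z).
    sym = 0.5 * (X.T @ Z + Z.T @ X)
    return Z - X @ sym

def riemannian_gradient(X, egrad):
    # For an embedded submanifold with the induced constant metric,
    # the Riemannian gradient is the projected Euclidean gradient.
    return stiefel_proj(X, egrad)

# Hypothetical example: f(X) = trace(X^T A X), Euclidean gradient 2 A X.
rng = np.random.default_rng(0)
n, p = 5, 2
A = rng.standard_normal((n, n))
A = A + A.T  # symmetric
X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # a point on St(n, p)
G = riemannian_gradient(X, 2 * A @ X)

# Tangency check: X^T G must be skew-symmetric.
print(np.allclose(X.T @ G + (X.T @ G).T, 0))
```

For non-constant metrics, as in the families of the paper, the projection and the gradient depend on the metric operator at each point, which is exactly where the operator-valued formulas come in.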

Related articles:
arXiv:2212.06379 [math.OC] (Published 2022-12-13)
Self-adaptive algorithms for quasiconvex programming and applications to machine learning
arXiv:1407.1097 [math.OC] (Published 2014-07-04)
Robust Optimization using Machine Learning for Uncertainty Sets
arXiv:2010.00848 [math.OC] (Published 2020-10-02)
Nonsmoothness in Machine Learning: specific structure, proximal identification, and applications