arXiv:1909.09501 [cs.LG]

Trivializations for Gradient-Based Optimization on Manifolds

Mario Lezcano-Casado

Published 2019-09-20 (Version 1)

We introduce a framework to study the transformation of problems with manifold constraints into unconstrained problems through parametrizations in terms of a Euclidean space. We call these parametrizations "trivializations". We prove conditions under which a trivialization is sound in the context of gradient-based optimization and we show how two large families of trivializations have overall favorable properties, but also suffer from a performance issue. We then introduce "dynamic trivializations", which solve this problem, and we show how these form a family of optimization methods that lie between trivializations and Riemannian gradient descent, and combine the benefits of both of them. We then show how to implement these two families of trivializations in practice for different matrix manifolds. To this end, we prove a formula for the gradient of the exponential of matrices, which can be of practical interest on its own. Finally, we show how dynamic trivializations improve the performance of existing methods on standard tasks designed to test long-term memory within neural networks.
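As a concrete illustration of the core idea (a sketch, not the paper's code), the snippet below shows a static exponential trivialization of the special orthogonal group in PyTorch: an unconstrained matrix A parametrizes an orthogonal matrix via Q = exp(A - Aᵀ), so any ordinary Euclidean optimizer can be used. The class name OrthogonalLinear and all variable names are illustrative assumptions, not identifiers from the paper.

```python
import torch

# Minimal sketch: static "trivialization" of SO(n) via the matrix exponential.
# An unconstrained parameter A in R^{n x n} is mapped through its
# skew-symmetric part (an element of the Lie algebra so(n)) to an
# orthogonal matrix Q = exp(A - A^T); gradient descent then runs in
# plain Euclidean space, with no projection or retraction steps.

class OrthogonalLinear(torch.nn.Module):
    def __init__(self, n: int):
        super().__init__()
        # Unconstrained Euclidean parameter.
        self.A = torch.nn.Parameter(0.01 * torch.randn(n, n))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        skew = self.A - self.A.T              # skew-symmetric matrix
        Q = torch.linalg.matrix_exp(skew)     # orthogonal: Q @ Q.T == I
        return x @ Q.T

# Usage: any standard optimizer works, because the manifold constraint
# is baked into the parametrization rather than enforced during updates.
layer = OrthogonalLinear(4)
opt = torch.optim.SGD(layer.parameters(), lr=1e-2)
x = torch.randn(8, 4)
loss = layer(x).pow(2).mean()
loss.backward()
opt.step()
```

Backpropagating through the matrix exponential here is exactly where the paper's formula for the gradient of the exponential of matrices becomes relevant; the dynamic trivializations of the paper additionally move the base point of the parametrization during training rather than keeping it fixed as in this static sketch.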

Related articles:
arXiv:2405.06312 [cs.LG] (Published 2024-05-10)
FedGCS: A Generative Framework for Efficient Client Selection in Federated Learning via Gradient-based Optimization
Zhiyuan Ning et al.
arXiv:2004.08763 [cs.LG] (Published 2020-04-19)
Model-Predictive Control via Cross-Entropy and Gradient-Based Optimization
arXiv:2402.01879 [cs.LG] (Published 2024-02-02, updated 2024-10-02)
$σ$-zero: Gradient-based Optimization of $\ell_0$-norm Adversarial Examples