arXiv Analytics

arXiv:2202.11550 [stat.ML]

Robust Geometric Metric Learning

Antoine Collas, Arnaud Breloy, Guillaume Ginolhac, Chengfang Ren, Jean-Philippe Ovarlez

Published 2022-02-23 (Version 1)

This paper proposes new algorithms for the metric learning problem. We start by noticing that several classical metric learning formulations from the literature can be viewed as modified covariance matrix estimation problems. Leveraging this point of view, we then study a general approach called Robust Geometric Metric Learning (RGML). This method simultaneously estimates the covariance matrix of each class while shrinking these matrices towards their (unknown) barycenter. We focus on two specific cost functions: one associated with the Gaussian likelihood (RGML Gaussian), and one with Tyler's M-estimator (RGML Tyler). In both cases, the barycenter is defined with respect to the Riemannian distance, which enjoys the desirable properties of geodesic convexity and affine invariance. The optimization is performed using the Riemannian geometry of symmetric positive definite matrices and its submanifold of unit-determinant matrices. Finally, the performance of RGML is assessed on real datasets: it exhibits strong performance while remaining robust to mislabeled data.
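The two geometric ingredients named in the abstract, the affine-invariant Riemannian distance on symmetric positive definite (SPD) matrices and the barycenter it induces, can be illustrated with a minimal NumPy sketch. This is not the authors' RGML implementation; it only shows the standard affine-invariant distance and a fixed-point (Karcher mean) iteration for the barycenter, with eigendecomposition-based matrix functions assumed for numerical stability.

```python
import numpy as np

def _spd_fun(A, fun):
    """Apply a scalar function to the eigenvalues of a symmetric matrix."""
    w, V = np.linalg.eigh((A + A.T) / 2)  # symmetrize against round-off
    return V @ np.diag(fun(w)) @ V.T

def riemannian_distance(A, B):
    """Affine-invariant distance d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F."""
    A_isqrt = _spd_fun(A, lambda w: 1.0 / np.sqrt(w))
    return np.linalg.norm(_spd_fun(A_isqrt @ B @ A_isqrt, np.log), 'fro')

def karcher_mean(mats, n_iter=30, step=1.0):
    """Riemannian barycenter of SPD matrices via fixed-point iteration."""
    G = sum(mats) / len(mats)  # initialize with the arithmetic mean
    for _ in range(n_iter):
        G_sqrt = _spd_fun(G, np.sqrt)
        G_isqrt = _spd_fun(G, lambda w: 1.0 / np.sqrt(w))
        # average of the log-maps of the inputs in the tangent space at G
        T = sum(_spd_fun(G_isqrt @ S @ G_isqrt, np.log) for S in mats) / len(mats)
        G = G_sqrt @ _spd_fun(step * T, np.exp) @ G_sqrt  # exp-map back
    return G
```

For commuting inputs the barycenter reduces to the geometric mean, e.g. the barycenter of I and 4I is 2I, which gives a quick sanity check of the iteration. The affine invariance mentioned in the abstract means d(C A Cᵀ, C B Cᵀ) = d(A, B) for any invertible C.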

Related articles:
arXiv:2405.06558 [stat.ML] (Published 2024-05-10)
Random matrix theory improved Fréchet mean of symmetric positive definite matrices
arXiv:2101.12416 [stat.ML] (Published 2021-01-29)
Covariance Prediction via Convex Optimization
arXiv:2003.13869 [stat.ML] (Published 2020-03-30)
ManifoldNorm: Extending normalizations on Riemannian Manifolds