arXiv:2311.13883 [cs.LG]

Leveraging Optimal Transport via Projections on Subspaces for Machine Learning Applications

Clément Bonet

Published 2023-11-23, Version 1

Optimal Transport has received much attention in Machine Learning, as it allows one to compare probability distributions by exploiting the geometry of the underlying space. However, in its original formulation, solving this problem incurs a significant computational burden. Thus, a meaningful line of work consists in proposing alternatives that reduce this burden while still enjoying its properties. In this thesis, we focus on alternatives which use projections on subspaces. The main such alternative is the Sliced-Wasserstein distance, which we first propose to extend to Riemannian manifolds, in order to use it in Machine Learning applications for which such spaces have been shown to be beneficial in recent years. We also study sliced distances between positive measures in the so-called unbalanced OT problem. Returning to the original Euclidean Sliced-Wasserstein distance between probability measures, we study the dynamics of gradient flows when endowing the space with this distance in place of the usual Wasserstein distance. Then, we investigate the use of the Busemann function, a generalization of the inner product in metric spaces, in the space of probability measures. Finally, we extend the subspace detour approach to incomparable spaces using the Gromov-Wasserstein distance.
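The Sliced-Wasserstein distance mentioned above replaces one high-dimensional OT problem with an average of one-dimensional ones: both measures are projected onto random directions, and the 1D Wasserstein distance (which has a closed form via sorting) is averaged over those directions. A minimal Monte Carlo sketch of this idea, assuming empirical measures with equal sample sizes and uniform weights (the function name and parameters are illustrative, not from the thesis):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    """Monte Carlo estimate of the Sliced-Wasserstein distance (p=2)
    between two empirical measures X, Y of shape (n, d)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw directions uniformly on the unit sphere S^{d-1}.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both samples onto each direction -> 1D distributions.
    X_proj = X @ theta.T  # shape (n, n_projections)
    Y_proj = Y @ theta.T
    # In 1D, optimal transport between uniform empirical measures
    # matches sorted samples, so W_2^2 is a mean of squared gaps.
    X_sorted = np.sort(X_proj, axis=0)
    Y_sorted = np.sort(Y_proj, axis=0)
    sw2 = np.mean((X_sorted - Y_sorted) ** 2)
    return np.sqrt(sw2)
```

Each projection costs only a sort, O(n log n), versus the cubic (or entropic-solver) cost of the full OT problem, which is the computational gain the thesis builds on.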

Related articles:
arXiv:2109.03469 [cs.LG] (Published 2021-09-08)
Understanding and Preparing Data of Industrial Processes for Machine Learning Applications
arXiv:2404.03082 [cs.LG] (Published 2024-04-03)
Machine Learning and Data Analysis Using Posets: A Survey
arXiv:2202.10723 [cs.LG] (Published 2022-02-22)
Sobolev Transport: A Scalable Metric for Probability Measures with Graph Metrics