arXiv:2101.01792 [stat.ML]

Minibatch optimal transport distances; analysis and applications

Kilian Fatras, Younes Zine, Szymon Majewski, Rémi Flamary, Rémi Gribonval, Nicolas Courty

Published 2021-01-05, Version 1

Optimal transport distances have become a classic tool to compare probability distributions and have found many applications in machine learning. Yet, despite recent algorithmic developments, their complexity prevents their direct use on large-scale datasets. To overcome this challenge, a common workaround is to compute these distances on minibatches, i.e., to average the outcomes of several smaller optimal transport problems. In this paper, we propose an extended analysis of this practice, whose effects were previously studied only in restricted cases. We first consider a large variety of optimal transport kernels. We notably argue that the minibatch strategy comes with appealing properties, such as unbiased estimators, unbiased gradients, and a concentration bound around the expectation, but also with limits: the minibatch OT is not a distance. To recover some of the lost distance axioms, we introduce a debiased minibatch OT function and study its statistical and optimisation properties. Along with this theoretical analysis, we also conduct empirical experiments on gradient flows, generative adversarial networks (GANs), and color transfer that highlight the practical interest of this strategy.
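The two ideas of the abstract — averaging OT costs over random minibatches, and debiasing by subtracting self-comparison terms — can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes 1-D data so the squared 2-Wasserstein cost has a closed form (sorted matching), and the function names (`minibatch_ot`, `debiased_minibatch_ot`) and parameters (`m` minibatch size, `k` number of minibatch pairs) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def w2_1d(x, y):
    # Closed-form squared 2-Wasserstein cost between equal-size 1-D
    # empirical samples: optimal coupling matches sorted order.
    return np.mean((np.sort(x) - np.sort(y)) ** 2)

def minibatch_ot(x, y, m=32, k=200):
    # Minibatch OT: average the OT cost over k pairs of size-m
    # minibatches drawn without replacement from each sample.
    costs = [w2_1d(rng.choice(x, m, replace=False),
                   rng.choice(y, m, replace=False))
             for _ in range(k)]
    return float(np.mean(costs))

def debiased_minibatch_ot(x, y, m=32, k=200):
    # Debiased variant (illustrative): subtract the self-comparison
    # terms so that comparing a distribution with itself gives ~0.
    return (minibatch_ot(x, y, m, k)
            - 0.5 * minibatch_ot(x, x, m, k)
            - 0.5 * minibatch_ot(y, y, m, k))
```

Note that `minibatch_ot(x, x)` is strictly positive, since two random minibatches from the same sample differ: this is the "not a distance" limitation the abstract mentions, and the debiased version brings that self-comparison back near zero (up to Monte Carlo noise).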
