arXiv Analytics

arXiv:2109.14206 [stat.ML]

Exact Statistical Inference for the Wasserstein Distance by Selective Inference

Vo Nguyen Le Duy, Ichiro Takeuchi

Published 2021-09-29, updated 2022-01-20 (Version 3)

In this paper, we study statistical inference for the Wasserstein distance, which has attracted much attention and has been applied to various machine learning tasks. Several inference methods have been proposed in the literature, but almost all of them rely on asymptotic approximation and lack finite-sample validity. In this study, we propose an exact (non-asymptotic) inference method for the Wasserstein distance, inspired by the concept of conditional Selective Inference (SI). To our knowledge, this is the first method that provides a valid confidence interval (CI) for the Wasserstein distance with a finite-sample coverage guarantee, and it applies not only to one-dimensional problems but also to multi-dimensional ones. We evaluate the performance of the proposed method on both synthetic and real-world datasets.
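For readers unfamiliar with the quantity being studied: in one dimension, the 1-Wasserstein distance between two equal-weight empirical samples reduces to the mean absolute difference of their sorted values. The sketch below illustrates only that basic computation; it is not the paper's inference procedure, and the function name and sample values are illustrative choices.

```python
# Illustrative sketch: 1-Wasserstein distance between two equal-size
# empirical samples on the real line. In 1D with uniform weights, the
# optimal coupling matches sorted order, so W1 is the mean absolute
# difference of sorted values. (The paper's contribution is the exact
# confidence interval for this quantity, not its computation.)
def wasserstein_1d(xs, ys):
    """W1 between two equal-weight empirical distributions in 1D."""
    if len(xs) != len(ys):
        raise ValueError("samples must have equal size for this sketch")
    n = len(xs)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / n

print(wasserstein_1d([0.0, 1.0, 3.0], [5.0, 6.0, 8.0]))  # → 5.0
```

The paper's point is that plugging such an empirical distance into an asymptotic normal approximation does not yield finite-sample-valid intervals, which motivates the conditional SI approach.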

Related articles:
arXiv:1805.11897 [stat.ML] (Published 2018-05-30)
Differential Properties of Sinkhorn Approximation for Learning with Wasserstein Distance
arXiv:2401.11562 [stat.ML] (Published 2024-01-21)
Enhancing selectivity using Wasserstein distance based reweighing
arXiv:2103.01678 [stat.ML] (Published 2021-03-02)
Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)