arXiv:2206.09194 [stat.ML]

Efficient Aggregated Kernel Tests using Incomplete $U$-statistics

Antonin Schrab, Ilmun Kim, Benjamin Guedj, Arthur Gretton

Published 2022-06-18 (Version 1)

We propose a series of computationally efficient, nonparametric tests for the two-sample, independence and goodness-of-fit problems, using the Maximum Mean Discrepancy (MMD), Hilbert-Schmidt Independence Criterion (HSIC), and Kernel Stein Discrepancy (KSD), respectively. Our test statistics are incomplete $U$-statistics, with a computational cost that interpolates between linear time in the number of samples and quadratic time, the latter being the cost of classical $U$-statistic tests. The three proposed tests aggregate over several kernel bandwidths to detect departures from the null on various scales: we call the resulting tests MMDAggInc, HSICAggInc and KSDAggInc. For the test thresholds, we derive a quantile bound for wild bootstrapped incomplete $U$-statistics, which is of independent interest. We derive uniform separation rates for MMDAggInc and HSICAggInc, and quantify exactly the trade-off between computational efficiency and the attainable rates: to our knowledge, this result is novel for tests based on incomplete $U$-statistics. We further show that in the quadratic-time case, the wild bootstrap incurs no penalty to test power over the more widespread permutation-based approaches, since both attain the same minimax optimal rates (which in turn match the rates obtained with oracle quantiles). We support our claims with numerical experiments on the trade-off between computational efficiency and test power. In all three testing frameworks, we observe that our proposed linear-time aggregated tests obtain higher power than current state-of-the-art linear-time kernel tests.
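For intuition, here is a minimal Python sketch of the two ingredients described above, in the two-sample (MMD) case: an incomplete $U$-statistic computed over a random subset of sample pairs, and a wild-bootstrap quantile used as the test threshold. The function names, the Gaussian kernel choice, and the uniform random pair design are illustrative assumptions; this is not the authors' MMDAggInc implementation, which additionally aggregates over a collection of bandwidths with a corrected test level.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth):
    """Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * bandwidth ** 2))

def h_mmd(x1, y1, x2, y2, bandwidth):
    """MMD U-statistic kernel h evaluated on two pairs (x1, y1), (x2, y2)."""
    k = lambda a, b: gaussian_kernel(a, b, bandwidth)
    return k(x1, x2) + k(y1, y2) - k(x1, y2) - k(x2, y1)

def incomplete_mmd(X, Y, bandwidth, R, rng):
    """
    Incomplete U-statistic estimate of MMD^2: instead of averaging h over
    all O(n^2) pairs (i, j), average over a random design of roughly R pairs,
    so the cost interpolates between linear (R ~ n) and quadratic (R ~ n^2).
    """
    n = X.shape[0]
    pairs = rng.choice(n, size=(R, 2))
    pairs = pairs[pairs[:, 0] != pairs[:, 1]]        # keep only pairs with i != j
    vals = np.array([h_mmd(X[i], Y[i], X[j], Y[j], bandwidth)
                     for i, j in pairs])
    return vals.mean(), vals, pairs

def wild_bootstrap_threshold(vals, pairs, n, alpha=0.05, B=500, rng=None):
    """
    Wild bootstrap quantile: resample the statistic by multiplying each term
    h(Z_i, Z_j) by Rademacher signs eps_i * eps_j, then take the (1 - alpha)
    empirical quantile of the bootstrapped statistics.
    """
    if rng is None:
        rng = np.random.default_rng()
    boot = np.empty(B)
    for b in range(B):
        eps = rng.choice([-1.0, 1.0], size=n)
        boot[b] = np.mean(eps[pairs[:, 0]] * eps[pairs[:, 1]] * vals)
    return np.quantile(boot, 1 - alpha)
```

In this sketch, the test would reject the null when the incomplete statistic exceeds the bootstrap quantile; an aggregated test repeats this for each bandwidth in a collection and rejects if any single test rejects at a suitably adjusted level.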

Related articles:
arXiv:2110.15073 [stat.ML] (Published 2021-10-28, updated 2022-06-22)
MMD Aggregated Two-Sample Test
arXiv:2202.00824 [stat.ML] (Published 2022-02-02)
KSD Aggregated Goodness-of-fit Test
arXiv:1909.05097 [stat.ML] (Published 2019-09-06)
Spectral Non-Convex Optimization for Dimension Reduction with Hilbert-Schmidt Independence Criterion