arXiv:2006.05240 [stat.ML]

How Robust is the Median-of-Means? Concentration Bounds in Presence of Outliers

Pierre Laforgue, Guillaume Staerman, Stephan Clémençon

Published 2020-06-09 (Version 1)

In contrast to the empirical mean, the Median-of-Means (MoM) is an estimator of the mean $\theta$ of a square integrable random variable $Z$ around which accurate nonasymptotic confidence bounds can be built, even when $Z$ does not exhibit sub-Gaussian tail behavior. Because of the high confidence it achieves on heavy-tailed data, MoM has recently found applications in statistical learning, where it is used to design training procedures that are insensitive to atypical or corrupted observations. For the first time, we provide concentration bounds for the MoM estimator in the presence of outliers that depend explicitly on the fraction of contaminated data in the sample. These results are also extended to "Medians-of-$U$-statistics" (i.e., medians of averages over tuples of observations), and are shown to yield generalization guarantees for pairwise learning techniques (e.g., ranking, metric learning) trained on contaminated data. Beyond this theoretical analysis, numerical results are presented that provide strong empirical evidence for the robustness properties claimed by the established learning rate bounds.
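To make the estimator concrete, below is a minimal sketch of MoM in Python. This is not the authors' code; the function name `median_of_means`, the block count, and the contamination setup are illustrative. The sample is partitioned at random into disjoint blocks, and the estimator is the median of the per-block empirical means.

```python
import numpy as np

def median_of_means(z, n_blocks, seed=0):
    """Median-of-Means estimate of the mean of the sample z.

    The sample is randomly split into n_blocks disjoint blocks;
    the estimate is the median of the per-block empirical means.
    """
    z = np.asarray(z, dtype=float)
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(z))            # random block assignment
    blocks = np.array_split(z[perm], n_blocks)
    block_means = [block.mean() for block in blocks]
    return np.median(block_means)

# Heavy-tailed sample (Lomax/Pareto II with shape 3, so finite variance)
# plus 1% gross contamination: the empirical mean is dragged away,
# while MoM stays close to the true mean of 1.5.
rng = np.random.default_rng(42)
sample = rng.pareto(3.0, size=1000) + 1.0     # true mean = 1 + 1/(3-1) = 1.5
sample[:10] = 1e6                             # 10 contaminated observations
print("empirical mean:", sample.mean())
print("MoM estimate  :", median_of_means(sample, n_blocks=50))
```

The intuition behind the robustness to contamination studied in the paper is visible here: with 10 outliers and 50 blocks, at most 10 block means are corrupted, which is far fewer than the 25 needed to move the median, so the estimate is unaffected by the magnitude of the outliers.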

Related articles:
arXiv:2105.14035 [stat.ML] (Published 2021-05-28)
DeepMoM: Robust Deep Learning With Median-of-Means
arXiv:1802.04784 [stat.ML] (Published 2018-02-13)
MONK -- Outlier-Robust Mean Embedding Estimation by Median-of-Means
arXiv:1905.10155 [stat.ML] (Published 2019-05-24)
Concentration bounds for linear Monge mapping estimation and optimal transport domain adaptation