arXiv:2001.06399 [cs.IT]

Robust Generalization via $α$-Mutual Information

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published 2020-01-14 (Version 1)

The aim of this work is to provide bounds connecting two probability measures of the same event using Rényi $\alpha$-Divergences and Sibson's $\alpha$-Mutual Information, generalizations of the Kullback-Leibler Divergence and of Shannon's Mutual Information, respectively. A case of particular interest arises when the two probability measures are a joint distribution and the corresponding product of marginals (representing the statistically independent scenario). In this case, a bound involving Sibson's $\alpha$-Mutual Information is retrieved, extending a result involving Maximal Leakage to general alphabets. These results have broad applications, from bounding the generalization error of learning algorithms to the more general framework of adaptive data analysis, provided that the divergences and/or information measures used are amenable to such an analysis (i.e., are robust to post-processing and compose adaptively). The generalization error bounds are derived with respect to high-probability events, but a corresponding bound on the expected generalization error is also retrieved.
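For reference, the abstract does not spell out the underlying quantities, but the standard definitions on which these bounds are built are, for $\alpha \in (1,\infty)$ and discrete alphabets,
$$D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_x P(x)^\alpha\, Q(x)^{1-\alpha}, \qquad I_\alpha(X;Y) = \frac{\alpha}{\alpha-1}\log \sum_y \Big(\sum_x P_X(x)\, P_{Y|X}(y\mid x)^\alpha\Big)^{1/\alpha}.$$
As $\alpha \to 1$ these recover the Kullback-Leibler Divergence and Shannon's Mutual Information, while as $\alpha \to \infty$ Sibson's $\alpha$-Mutual Information converges to Maximal Leakage, which is the sense in which the bound mentioned above extends the earlier Maximal Leakage result.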

Comments: Accepted to IZS2020. arXiv admin note: substantial text overlap with arXiv:1912.01439
Categories: cs.IT, cs.LG, math.IT
Related articles:
arXiv:1912.01439 [cs.IT] (Published 2019-12-01)
Generalization Error Bounds Via Rényi-, $f$-Divergences and Maximal Leakage
arXiv:2205.07050 [cs.IT] (Published 2022-05-14)
Generalization error bounds for DECONET: a deep unfolded network for analysis Compressive Sensing
arXiv:0911.2784 [cs.IT] (Published 2009-11-14, updated 2011-10-05)
On Bregman Distances and Divergences of Probability Measures