arXiv:1912.01439 [cs.IT]

Generalization Error Bounds Via Rényi-, $f$-Divergences and Maximal Leakage

Amedeo Roberto Esposito, Michael Gastpar, Ibrahim Issa

Published 2019-12-01, Version 1

In this work, the probability of an event under some joint distribution is bounded by measuring it with the product of the marginals instead (which is typically easier to analyze), together with a measure of the dependence between the two random variables. These results find applications in adaptive data analysis, where multiple dependencies are introduced, and in learning theory, where they can be employed to bound the generalization error of a learning algorithm. Bounds are given in terms of the $\alpha$-Divergence, Sibson's Mutual Information, and $f$-Divergence. A case of particular interest is Maximal Leakage (or Sibson's Mutual Information of order infinity), since this measure is robust to post-processing and composes adaptively. The corresponding bound can also be seen as a generalization of classical bounds, such as Hoeffding's and McDiarmid's inequalities, to the case of dependent random variables.
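As a rough illustration of the kind of bound described above (a sketch based on the standard definition of maximal leakage, not a statement quoted from the paper), let $E$ be an event over a pair $(X, Y) \sim P_{XY}$, write $E_y = \{x : (x, y) \in E\}$ for its $y$-sections, and let $\mathcal{L}(X \to Y)$ denote the maximal leakage from $X$ to $Y$. The maximal-leakage bound then typically takes the form

$$P_{XY}(E) \;\le\; 2^{\mathcal{L}(X \to Y)} \cdot \max_{y} P_X(E_y), \qquad \mathcal{L}(X \to Y) = \log_2 \sum_{y} \max_{x:\, P_X(x) > 0} P_{Y \mid X}(y \mid x),$$

so that the joint probability is controlled by a worst-case marginal probability, inflated by a factor measuring the dependence between $X$ and $Y$; the exact constants and logarithm base should be taken from the paper itself. Instantiating $X$ as the training sample, $Y$ as the output hypothesis, and $E$ as the event that the generalization error exceeds a threshold recovers Hoeffding/McDiarmid-type tail bounds with an extra $2^{\mathcal{L}(X \to Y)}$ factor, which reduce to the classical inequalities when $X$ and $Y$ are independent.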

Comments: arXiv admin note: text overlap with arXiv:1903.01777
Categories: cs.IT, cs.LG, math.IT, math.PR
Related articles:
arXiv:2205.07050 [cs.IT] (Published 2022-05-14)
Generalization error bounds for DECONET: a deep unfolded network for analysis Compressive Sensing
arXiv:1010.3613 [cs.IT] (Published 2010-10-18)
The Common Information of N Dependent Random Variables
arXiv:2004.08035 [cs.IT] (Published 2020-04-17)
A Case for Maximal Leakage as a Side Channel Leakage Metric