arXiv:2202.03951 [cs.IT]

On Sibson's $α$-Mutual Information

Amedeo Roberto Esposito, Adrien Vandenbroucque, Michael Gastpar

Published 2022-02-08 (Version 1)

We explore a family of information measures that stems from Rényi's $\alpha$-divergences with $\alpha<0$. In particular, we extend the definition of Sibson's $\alpha$-mutual information to negative values of $\alpha$ and establish several properties of these objects. Moreover, we highlight how this family of information measures relates to functional inequalities that can be employed in a variety of fields, including lower bounds on the risk in Bayesian estimation procedures.
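For discrete alphabets, Sibson's $\alpha$-mutual information admits the standard closed form $I_\alpha(X;Y) = \frac{\alpha}{\alpha-1}\log\sum_y \big(\sum_x P_X(x)\,P_{Y|X}(y|x)^\alpha\big)^{1/\alpha}$ (for $\alpha \neq 1$). A minimal numerical sketch of this formula, with function and variable names chosen for illustration (they do not come from the paper), is:

```python
import math

def sibson_mi(p_x, p_y_given_x, alpha):
    """Sibson's alpha-mutual information (in nats) for a discrete joint law,
    via the closed form
        I_a(X;Y) = a/(a-1) * log sum_y ( sum_x P_X(x) P_{Y|X}(y|x)^a )^(1/a).

    p_x          : list of input probabilities P_X(x)
    p_y_given_x  : row-stochastic matrix, p_y_given_x[x][y] = P_{Y|X}(y|x)
    alpha        : order, alpha != 1 (for alpha < 0, as studied in the paper,
                   all channel entries must be strictly positive)
    """
    assert alpha != 1.0
    num_y = len(p_y_given_x[0])
    total = 0.0
    for y in range(num_y):
        # inner sum over x of P_X(x) * P_{Y|X}(y|x)^alpha
        inner = sum(px * (row[y] ** alpha) for px, row in zip(p_x, p_y_given_x))
        total += inner ** (1.0 / alpha)
    return alpha / (alpha - 1.0) * math.log(total)
```

As sanity checks: when $X$ and $Y$ are independent the measure is zero for every admissible $\alpha$, and as $\alpha \to 1$ it recovers Shannon's mutual information.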

Related articles:
arXiv:2102.00720 [cs.IT] (Published 2021-02-01)
On conditional Sibson's $α$-Mutual Information
arXiv:1907.06992 [cs.IT] (Published 2019-07-10)
The Design of Mutual Information
arXiv:1510.02330 [cs.IT] (Published 2015-10-08)
On Maximal Correlation, Mutual Information and Data Privacy