arXiv:cs/0701050 [cs.IT]

A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information

Olivier Rioul

Published 2007-01-08, updated 2007-04-13 (version 2)

While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information-theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher information (FI) or the minimum mean-square error (MMSE). In this paper, we first present a unified view of the proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI that is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
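
For reference, here is a standard statement of the EPI and of the two integral representations mentioned above; the notation is the conventional one, not drawn from the paper itself. For a real-valued random variable $X$ with differential entropy $h(X)$, the entropy power is

\[
N(X) \;\triangleq\; \frac{1}{2\pi e}\, e^{2h(X)},
\]

and the EPI asserts that, for independent $X$ and $Y$,

\[
N(X+Y) \;\ge\; N(X) + N(Y),
\]

with equality if and only if $X$ and $Y$ are Gaussian. The FI and MMSE proof routes rest on the dual identities

\[
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right)
\qquad \text{(de Bruijn's identity)},
\]

\[
\frac{d}{d\gamma}\, I\!\left(X;\, \sqrt{\gamma}\,X + Z\right) \;=\; \frac{1}{2}\, \operatorname{mmse}(X, \gamma)
\qquad \text{(I-MMSE relation)},
\]

where $Z$ is a standard Gaussian independent of $X$, $J$ denotes Fisher information, and $\operatorname{mmse}(X, \gamma)$ is the minimum mean-square error of estimating $X$ from $\sqrt{\gamma}\,X + Z$.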

Comments: 5 pages, accepted for presentation at the IEEE International Symposium on Information Theory 2007
Categories: cs.IT, math.IT
Related articles:
arXiv:0704.1751 [cs.IT] (Published 2007-04-13, updated 2010-08-24)
Information Theoretic Proofs of Entropy Power Inequalities
arXiv:1607.02330 [cs.IT] (Published 2016-07-08)
Two Measures of Dependence
arXiv:1510.02330 [cs.IT] (Published 2015-10-08)
On Maximal Correlation, Mutual Information and Data Privacy