arXiv:cs/0701050 [cs.IT]
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
Published 2007-01-08, updated 2007-04-13 (Version 2)
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) seems to be an exception: available information theoretic proofs of the EPI hinge on integral representations of differential entropy using either Fisher information (FI) or the minimum mean-square error (MMSE). In this paper, we first present a unified view of the proofs via FI and MMSE, showing that they are essentially dual versions of the same proof, and then fill the gap by providing a new, simple proof of the EPI that is based solely on the properties of mutual information and sidesteps both the FI and MMSE representations.
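For reference, the inequality discussed in the abstract is Shannon's entropy power inequality: for independent continuous random vectors $X$ and $Y$ in $\mathbb{R}^n$ with differential entropies $h(X)$ and $h(Y)$, the entropy power $N(\cdot)$ is superadditive under convolution. A standard statement (not part of the abstract itself) is:

```latex
% Entropy power of a continuous random vector X in R^n
N(X) \;=\; \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}

% Shannon's entropy power inequality (EPI), for independent X and Y:
N(X + Y) \;\ge\; N(X) + N(Y)

% with equality if and only if X and Y are Gaussian
% with proportional covariance matrices.
```

Equivalently, the EPI can be written directly in terms of differential entropies as $e^{\frac{2}{n} h(X+Y)} \ge e^{\frac{2}{n} h(X)} + e^{\frac{2}{n} h(Y)}$.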