arXiv Analytics


arXiv:1907.06992 [cs.IT]

The Design of Mutual Information

Nicholas Carrara

Published 2019-07-10 (Version 1)

We derive the functional form of mutual information (MI) from a set of design criteria and a principle of maximal sufficiency. The MI between two sets of propositions is a global quantifier of correlations and is implemented as a tool for ranking joint probability distributions with respect to those correlations. The derivation parallels the derivation of relative entropy, with an emphasis on the behavior of independent variables. By constraining the functional $I$ according to special cases, we arrive at its general functional form and thereby establish a clear meaning behind its definition. We also discuss the notion of sufficiency and offer a new definition that broadens its applicability.
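
For context, the functional form in question is the standard Shannon mutual information; the expression below is the textbook definition, stated here for reference rather than quoted from the paper:

$$I[p] = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}$$

This quantity vanishes exactly when the joint distribution factorizes, $p(x,y) = p(x)\,p(y)$, which is consistent with the abstract's emphasis on the behavior of independent variables.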

Related articles:
arXiv:2202.03951 [cs.IT] (Published 2022-02-08)
On Sibson's $α$-Mutual Information
arXiv:cs/0701050 [cs.IT] (Published 2007-01-08, updated 2007-04-13)
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
arXiv:1301.5942 [cs.IT] (Published 2013-01-25, updated 2013-01-28)
Confidence Intervals for the Mutual Information