arXiv:1907.06992 [cs.IT]
The Design of Mutual Information
Published 2019-07-10 (Version 1)
We derive the functional form of mutual information (MI) from a set of design criteria and a principle of maximal sufficiency. The MI between two sets of propositions is a global quantifier of correlations and is implemented as a tool for ranking joint probability distributions with respect to those correlations. The derivation parallels the derivations of relative entropy, with an emphasis on the behavior of independent variables. By constraining the functional $I$ according to special cases, we arrive at its general functional form and thereby establish a clear meaning behind its definition. We also discuss the notion of sufficiency and offer a new definition which broadens its applicability.
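For context, the functional form recovered by a design derivation of this kind is the standard Shannon mutual information; stated here as the usual textbook definition for discrete variables, not as a verbatim result quoted from the paper:
$$ I[p] \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} $$
Here $I[p] \ge 0$, with equality exactly when $p(x,y) = p(x)\,p(y)$, i.e., when the two sets of propositions are independent; this vanishing under independence is the special-case behavior that constrains the functional $I$ in the derivation.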
Related articles:
arXiv:2202.03951 [cs.IT] (Published 2022-02-08): On Sibson's $α$-Mutual Information
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
Confidence Intervals for the Mutual Information