arXiv:1607.02330 [cs.IT]
Two Measures of Dependence
Amos Lapidoth, Christoph Pfister
Published 2016-07-08 (Version 1)
Motivated by a distributed task-encoding problem, two closely related families of dependence measures are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both reduce to the mutual information when the parameter α is one. Their properties are studied, and it is shown that the first measure shares many properties with mutual information, including the data-processing inequality. The second measure does not satisfy the data-processing inequality, but it arises naturally in the context of distributed task encoding.
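For orientation, a brief sketch of the underlying quantities (these are standard definitions, not excerpts from the paper, and the paper's exact constructions may differ): the Rényi divergence of order α, for α > 0 and α ≠ 1, is
\[
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
\]
and it converges to the relative entropy D(P ‖ Q) as α → 1. Since mutual information admits the variational form
\[
I(X;Y) \;=\; \min_{Q_X,\, Q_Y} D\bigl(P_{XY} \,\big\|\, Q_X \times Q_Y\bigr),
\]
with the minimum attained at the marginals Q_X = P_X and Q_Y = P_Y, replacing D by D_α in such a minimization yields a one-parameter family that recovers I(X;Y) at α = 1, consistent with the reduction described in the abstract.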
Comments: 5 pages; submitted to ICSEE 2016