arXiv:1607.02330 [cs.IT]

Two Measures of Dependence

Amos Lapidoth, Christoph Pfister

Published 2016-07-08 (Version 1)

Motivated by a distributed task-encoding problem, two closely related families of dependence measures are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both reduce to the mutual information when the parameter α is one. Their properties are studied, and it is shown that the first measure shares many properties with the mutual information, including the data-processing inequality. The second measure does not satisfy the data-processing inequality, but it appears naturally in the context of distributed task encoding.
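
For concreteness, here is a brief LaTeX sketch of the ingredients the abstract refers to. The Rényi divergence is the standard definition; the displayed form of a measure J_α is an assumption about the construction (one natural way to build a dependence measure from D_α that reduces to mutual information at α = 1), not a quote from the paper.

% Rényi divergence of order \alpha (\alpha > 0, \alpha \neq 1); standard definition:
D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^{\alpha} Q(x)^{1-\alpha}

% As \alpha \to 1 this tends to the Kullback-Leibler divergence D(P \| Q),
% and the mutual information admits a minimum over product distributions,
% attained at the marginals P_X and P_Y:
I(X;Y) = \min_{Q_X, Q_Y} D\bigl(P_{XY} \,\|\, Q_X \times Q_Y\bigr)

% Hypothetical form of the first measure (an assumption, not stated in the
% abstract): replace D by D_\alpha in the minimization above,
J_\alpha(X;Y) = \min_{Q_X, Q_Y} D_\alpha\bigl(P_{XY} \,\|\, Q_X \times Q_Y\bigr)
% which reduces to I(X;Y) at \alpha = 1; the second family would be built
% analogously, with the relative \alpha-entropy in place of D_\alpha.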

Comments: 5 pages; submitted to ICSEE 2016
Categories: cs.IT, math.IT
Related articles:
arXiv:cs/0701050 [cs.IT] (Published 2007-01-08, updated 2007-04-13)
A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
arXiv:1301.5942 [cs.IT] (Published 2013-01-25, updated 2013-01-28)
Confidence Intervals for the Mutual Information
arXiv:1607.01184 [cs.IT] (Published 2016-07-05)
Calculation of mutual information for nonlinear communication channel at large SNR