arXiv:1303.3235 [cs.IT]

On the Entropy of Couplings

Mladen Kovačević, Ivan Stanojević, Vojin Šenk

Published 2013-03-13, updated 2015-03-22 (Version 2)

In this paper, some general properties of Shannon information measures are investigated over sets of probability distributions with restricted marginals. Certain optimization problems associated with these functionals are shown to be NP-hard, and their special cases are found to be essentially information-theoretic restatements of well-known computational problems, such as SUBSET SUM and 3-PARTITION. The notion of minimum entropy coupling is introduced and its relevance is demonstrated in information-theoretic, computational, and statistical contexts. Finally, a family of pseudometrics on the space of discrete probability distributions, defined by these couplings, is studied; in particular, its relation to the total variation distance is examined, and a new characterization of the conditional entropy is given.
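
To make the coupling notion concrete: a coupling of two discrete distributions p and q is a joint distribution whose marginals are p and q, and the abstract states that minimizing its entropy is NP-hard. The sketch below, in Python, is only an illustrative greedy heuristic that builds some low-entropy coupling by repeatedly pairing the largest remaining marginal masses; the function names and the heuristic itself are assumptions for illustration, not the paper's construction.

    import numpy as np

    def greedy_coupling(p, q):
        # Hypothetical helper: greedily place mass min(p[i], q[j]) on the cell
        # formed by the currently largest residual entries of p and q.
        p = np.array(p, dtype=float)
        q = np.array(q, dtype=float)
        M = np.zeros((len(p), len(q)))
        while p.sum() > 1e-12:
            i, j = int(np.argmax(p)), int(np.argmax(q))
            m = min(p[i], q[j])
            M[i, j] += m
            p[i] -= m
            q[j] -= m
        return M

    def entropy(P):
        # Shannon entropy (in bits) of a distribution or joint matrix.
        P = np.asarray(P, dtype=float).ravel()
        P = P[P > 0]
        return float(-(P * np.log2(P)).sum())

    p = [0.5, 0.25, 0.25]
    q = [0.6, 0.4]
    M = greedy_coupling(p, q)
    print(M.sum(axis=1), M.sum(axis=0))        # marginals recover p and q
    print(entropy(M), max(entropy(p), entropy(q)))  # H(coupling) >= max(H(p), H(q))

The printed comparison illustrates the basic fact that any coupling has entropy at least max(H(p), H(q)); the minimum entropy coupling is the one that comes closest to this lower bound.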

Comments: 18 pages (single-column). Compared to v1, the material is reorganized, Section IV.C is removed (the results will possibly appear elsewhere), and Propositions 3.2 and 3.7 are added. Accepted for publication in Information and Computation.
Categories: cs.IT, math.IT
Subjects: 94A17, 60E99, 68Q17
Related articles:
arXiv:2107.01975 [cs.IT] (Published 2021-07-05)
The information loss of a stochastic map
arXiv:2404.02167 [cs.IT] (Published 2024-03-26)
A remark on conditional entropy
arXiv:1301.7504 [cs.IT] (Published 2013-01-31, updated 2013-07-16)
Improved Lower Bounds on the Total Variation Distance for the Poisson Approximation