arXiv:1301.5942 [cs.IT]

Confidence Intervals for the Mutual Information

A. G. Stefani, J. B. Huber, C. Jardin, H. Sticht

Published 2013-01-25, updated 2013-01-28 (version 2)

By combining a bound on the absolute difference of the mutual information of two joint probability distributions at a fixed variational distance with a bound on the probability of a maximal deviation in variational distance between a true joint probability distribution and its empirical counterpart, confidence intervals for the mutual information of two random variables with finite alphabets are established. In contrast to previous results, these intervals require no assumptions on the distribution or the sample size.
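To make the two-step recipe concrete, the following is a minimal Python sketch, not the authors' exact bounds. Step (1) uses the standard L1 deviation bound Pr(||P_hat − P||_1 ≥ ε) ≤ (2^k − 2) exp(−n ε²/2) for an empirical distribution over k outcomes (Weissman et al., 2003); step (2) assumes a Fannes/Audenaert-type entropy continuity bound applied to H(X), H(Y), and H(X,Y) in place of the paper's own mutual-information continuity bound, which may be tighter. The function name and interface are illustrative, not from the paper.

```python
import math
from collections import Counter

def binary_entropy(t: float) -> float:
    """h2(t) in nats; h2(0) = h2(1) = 0."""
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * math.log(t) - (1.0 - t) * math.log(1.0 - t)

def entropy_continuity(trace_dist: float, d: int) -> float:
    """Fannes/Audenaert-type bound: |H(P)-H(Q)| <= T*log(d-1) + h2(T), T = L1/2."""
    t = min(trace_dist, 1.0 - 1.0 / d)  # bound is stated for T <= 1 - 1/d
    return t * math.log(max(d - 1, 1)) + binary_entropy(t)

def mi_confidence_interval(pairs, nx: int, ny: int, alpha: float = 0.05):
    """Confidence interval (in nats) for I(X;Y) from n i.i.d. samples (x, y)."""
    n = len(pairs)
    k = nx * ny  # joint alphabet size

    # Step 1: eps such that ||P_hat - P||_1 < eps with probability >= 1 - alpha.
    # Solving (2^k - 2) exp(-n eps^2 / 2) = alpha, with ln(2^k - 2) <= k ln 2
    # to stay conservative and avoid overflow for large alphabets.
    eps = math.sqrt(2.0 * (k * math.log(2.0) + math.log(1.0 / alpha)) / n)

    # Plug-in estimate of mutual information from the empirical joint distribution.
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi_hat = sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
                 for (x, y), c in joint.items())

    # Step 2: convert eps into an MI deviation bound via I = H(X) + H(Y) - H(X,Y).
    # Each marginal's L1 distance is at most the joint's, so the three entropy
    # continuity bounds simply add, each evaluated at T = eps / 2.
    t = eps / 2.0
    delta = (entropy_continuity(t, nx) + entropy_continuity(t, ny)
             + entropy_continuity(t, nx * ny))
    # Clamp to the trivial range 0 <= I(X;Y) <= log min(|X|, |Y|).
    return max(mi_hat - delta, 0.0), min(mi_hat + delta, math.log(min(nx, ny)))
```

For example, mi_confidence_interval(samples, nx=2, ny=2) returns an interval valid for any distribution and any sample size, as the abstract claims; the k·ln 2 term in eps shows the price of that generality, since the interval stays wide unless n is large relative to the joint alphabet size.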

Related articles:
arXiv:1703.04923 [cs.IT] (Published 2017-03-15)
Greedy-Merge Degrading has Optimal Power-Law
arXiv:1601.03439 [cs.IT] (Published 2016-01-13)
On the Exact Distribution of Mutual Information of Two-user MIMO MAC Based on Quotient Distribution of Wishart Matrices
arXiv:0704.1751 [cs.IT] (Published 2007-04-13, updated 2010-08-24)
Information Theoretic Proofs of Entropy Power Inequalities