arXiv:2107.01975 [cs.IT]

The information loss of a stochastic map

James Fullwood, Arthur J. Parzygnat

Published 2021-07-05 (Version 1)

We provide a stochastic extension of the Baez-Fritz-Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call 'conditional information loss'. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an 'entropic Bayes' rule' for information measures, and we provide a characterization of conditional entropy in terms of this rule.
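As a quick numerical illustration of the quantities the abstract mentions, here is a minimal Python sketch, assuming finite probability spaces and row-stochastic channels; it is not the authors' Markov-categorical formalism, and the function names and toy example are illustrative assumptions, not taken from the paper. It computes the conditional entropy H(X|Y) of the joint distribution induced by a distribution p on X and a channel f(y|x); for a deterministic measure-preserving map this reduces to the entropy difference H(p) - H(q), the Baez-Fritz-Leinster information loss.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def pushforward(p, channel):
    """Marginal distribution on Y induced by p on X and a row-stochastic channel f(y|x)."""
    return np.asarray(p, dtype=float) @ np.asarray(channel, dtype=float)

def conditional_entropy_x_given_y(p, channel):
    """H(X|Y) = H(X,Y) - H(Y) for the joint distribution p(x) f(y|x).
    For a deterministic channel this equals H(p) - H(q), i.e. the
    Baez-Fritz-Leinster information loss of a measure-preserving function."""
    p = np.asarray(p, dtype=float)
    channel = np.asarray(channel, dtype=float)
    joint = p[:, None] * channel          # joint distribution on X x Y
    q = joint.sum(axis=0)                 # marginal on Y
    return shannon_entropy(joint.ravel()) - shannon_entropy(q)

# Deterministic example: f merges two outcomes of a fair 4-sided die.
p = [0.25, 0.25, 0.25, 0.25]
f = [[1, 0, 0],
     [1, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]
print(conditional_entropy_x_given_y(p, f))                       # 0.5 bits
print(shannon_entropy(p) - shannon_entropy(pushforward(p, f)))   # also 0.5 bits
```

For a genuinely stochastic channel the two printed quantities differ by H(Y|X), so the entropy difference H(p) - H(q) no longer captures everything; the conditional entropy recovered by the stochastic extension described in the abstract remains well defined in that case.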

Related articles:
arXiv:2303.07245 [cs.IT] (Published 2023-03-13, updated 2023-10-30)
Concentration without Independence via Information Measures
arXiv:1303.3235 [cs.IT] (Published 2013-03-13, updated 2015-03-22)
On the Entropy of Couplings
arXiv:2404.02167 [cs.IT] (Published 2024-03-26)
A remark on conditional entropy