arXiv Analytics

arXiv:1303.6409 [cs.IT]

Information Measures for Deterministic Input-Output Systems

Bernhard C. Geiger, Gernot Kubin

Published 2013-03-26, updated 2013-04-17Version 2

In this work, the information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input is continuously distributed. Based on this finiteness, the problem of perfectly reconstructing the input is addressed and Fano-type bounds between the information loss and the reconstruction error probability are derived. For systems with infinite information loss, a relative measure is defined and shown to be tightly related to Rényi information dimension. Employing another Fano-type argument, the reconstruction error probability is bounded from below by the relative information loss. In view of developing a system theory from an information-theoretic point of view, the theoretical results are illustrated by a few example systems, among them a multi-channel autocorrelation receiver.
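For a discrete input, the information loss of a deterministic, memoryless map Y = g(X) reduces to H(X|Y) = H(X) - H(Y), since Y is fully determined by X. A minimal sketch of this computation (the function names `entropy` and `information_loss` are illustrative and not taken from the paper):

```python
import math
from collections import Counter

def entropy(pmf):
    # Shannon entropy in bits of a probability mass function
    # given as a dict mapping outcomes to probabilities.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def information_loss(pX, g):
    # For deterministic Y = g(X), H(X|Y) = H(X) - H(Y):
    # push the input distribution through g to get the output pmf.
    pY = Counter()
    for x, p in pX.items():
        pY[g(x)] += p
    return entropy(pX) - entropy(pY)

# Example: X uniform on {-2, -1, 1, 2}, g(x) = |x|.
# The map discards exactly the sign, i.e. one bit of information.
pX = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}
loss = information_loss(pX, abs)  # 1.0 bit
```

For continuously distributed inputs the paper's finiteness result is what makes H(X|Y) a meaningful quantity at all; the discrete sketch above only illustrates the defining identity.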

Related articles:
arXiv:2303.07245 [cs.IT] (Published 2023-03-13, updated 2023-10-30)
Concentration without Independence via Information Measures
arXiv:2202.03951 [cs.IT] (Published 2022-02-08)
On Sibson's α-Mutual Information
arXiv:1404.6810 [cs.IT] (Published 2014-04-27, updated 2014-11-28)
Information Measures: the Curious Case of the Binary Alphabet