arXiv Analytics

arXiv:1208.4415 [cs.IT]

Distributed Channel Synthesis

Paul Cuff

Published 2012-08-22, updated 2013-08-20 (Version 3)

Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal trade-off between the amount of common randomness used and the required rate of description. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using a soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this work also generalizes and strengthens this soft covering tool.
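As a small numerical illustration of one of the two extreme points above (the function name is illustrative, not from the paper): Shannon's mutual information, the description rate sufficient when unlimited common randomness is available, can be computed directly from a joint pmf. Wyner's common information, the other extreme, is instead a minimization of I(X,Y;U) over auxiliary variables U satisfying the Markov chain X−U−Y, and has no comparably simple closed form in general.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits, for a joint pmf given as a 2-D array p_xy[x][y]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, row vector
    mask = p_xy > 0                         # skip zero-probability terms
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# X uniform bit and Y = X: one full bit of mutual information
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Independent uniform bits: zero
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

For a perfectly correlated pair the two extremes coincide only in degenerate cases; in general Wyner's common information is at least I(X;Y), which is why common randomness strictly reduces the required description rate for channel synthesis.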

Comments: To appear in IEEE Trans. on Information Theory (submitted Aug., 2012, accepted July, 2013), 26 pages, using IEEEtran.cls
Journal: IEEE Trans. on Inf. Theory, 59(11):7071-96, November, 2013
Categories: cs.IT, math.IT
Subjects: 94A15, H.1.1
Related articles:
arXiv:1605.06396 [cs.IT] (Published 2016-05-20)
Soft Covering with High Probability
arXiv:1506.00193 [cs.IT] (Published 2015-05-31)
Gaussian Secure Source Coding and Wyner's Common Information
arXiv:2102.08157 [cs.IT] (Published 2021-02-16)
Lower bound on Wyner's Common Information