arXiv Analytics

arXiv:1302.1461 [cs.IT]

Stopping Criteria for Iterative Decoding based on Mutual Information

Jinhong Wu, Branimir R. Vojcic, Jia Sheng

Published 2013-02-06, Version 1

In this paper we investigate stopping criteria for iterative decoding from a mutual information perspective. We introduce new iteration-stopping rules based on an approximation of the mutual information between the encoded bits and the decoder soft output. The first type of stopping rule terminates decoding once the approximated mutual information exceeds a threshold, which can be adjusted according to the expected bit error rate. The second adopts a strategy similar to that of the well-known cross-entropy stopping rule, applying a fixed threshold to the ratio of a simple metric computed after each iteration to its value after the first iteration. Compared with several well-known stopping rules, the new methods achieve higher efficiency.
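As a rough illustration of the two kinds of rules the abstract describes, here is a minimal Python sketch. The mutual-information approximation used below (a Hagenauer-style time average over the magnitudes of the decoder's output LLRs) and all threshold values are illustrative assumptions, not the exact formulas or parameters from the paper.

```python
import math

def approx_mutual_information(llrs):
    """Approximate the mutual information between encoded bits and
    decoder soft output from the output LLRs.

    Assumed approximation (Hagenauer-style time average):
        I ~= 1 - (1/N) * sum_n log2(1 + exp(-|L_n|))
    which is 0 when all LLRs are 0 and approaches 1 as |L_n| grows.
    """
    n = len(llrs)
    return 1.0 - sum(math.log2(1.0 + math.exp(-abs(l))) for l in llrs) / n

def should_stop_mi_threshold(llrs, threshold=0.98):
    # Rule of the first type: stop iterating once the approximated
    # mutual information exceeds a threshold (0.98 is a placeholder;
    # in practice it would be tuned to the expected bit error rate).
    return approx_mutual_information(llrs) >= threshold

def should_stop_ratio(metric_current, metric_first, ratio_threshold=1e-2):
    # Rule of the second type (cross-entropy-like): stop once the
    # per-iteration metric has dropped below a fixed fraction of its
    # value after the first iteration. The metric itself and the
    # ratio 1e-2 are stand-ins for illustration.
    return metric_current <= ratio_threshold * metric_first
```

In a decoder loop, one would evaluate `should_stop_mi_threshold` (or track the metric ratio) after each iteration and break out early when it returns `True`, saving the remaining iterations on frames that have already converged.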

Comments: The Asilomar Conference on Signals, Systems, and Computers, Monterey, CA, Nov., 2012
Categories: cs.IT, math.IT