arXiv Analytics

arXiv:1205.6602 [cs.IT]

Analytical Bounds between Entropy and Error Probability in Binary Classifications

Bao-Gang Hu, Hong-Jie Xing

Published 2012-05-30, Version 1

The existing upper and lower bounds between entropy and error probability are mostly derived from inequalities on entropy relations, which can introduce approximations into the analysis. We derive analytical bounds based on closed-form solutions of the conditional entropy, without involving any approximation. Two basic types of classification errors are investigated in the context of binary classification problems, namely, Bayesian and non-Bayesian errors. We theoretically confirm that Fano's lower bound is an exact lower bound for any type of classifier in a relation diagram of "error probability vs. conditional entropy". The analytical upper bounds are derived with respect to the minimum prior probability and are tighter than Kovalevskij's upper bound.

Comments: 7 Pages, 2 Figures, Maple code
Categories: cs.IT, math.IT
Related articles:
arXiv:1604.01566 [cs.IT] (Published 2016-04-06)
Achievable Rates for Gaussian Degraded Relay Channels with Non-Vanishing Error Probabilities
arXiv:1604.07560 [cs.IT] (Published 2016-04-26)
Bounds on the Error Probability of Raptor Codes
arXiv:1701.02088 [cs.IT] (Published 2017-01-09)
On Achievable Rates of AWGN Energy-Harvesting Channels with Block Energy Arrival and Non-Vanishing Error Probabilities