Information transfer through classifiers and its relation to probability of error

2 Author(s)
D. Erdogmus and J. C. Principe, Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL, USA

Fano's (1961) bound gives a lower bound on the classification error probability and indicates how the information transfer through a classifier affects its performance. It was an important step toward linking information theory and pattern recognition. In this paper, a family of lower bounds is derived using Renyi's entropy; this family yields Fano's lower bound as a special case. Using a different set of entropy orders, Renyi's definition also allows the construction of a family of upper bounds on the probability of error, which is impossible with Shannon's definition of entropy. Further analysis to obtain the tightest lower and upper bounds reveals that Fano's bound is indeed the tightest lower bound, and that the upper bounds become tighter as the entropy order approaches one from below. Numerical evaluations of the bounds are presented for three digital modulation schemes over an AWGN channel.
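
For orientation, here is a brief sketch of the two classical ingredients these bounds build on; the paper's exact parametric family is not reproduced here. Renyi's entropy of order $\alpha$ for a discrete distribution $p = (p_1, \dots, p_N)$ is

$$H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{k=1}^{N} p_k^{\alpha}, \qquad \alpha > 0, \ \alpha \neq 1,$$

which recovers Shannon's entropy $H(p) = -\sum_k p_k \log p_k$ in the limit $\alpha \to 1$. Fano's inequality for an $N$-class problem relates the error probability $P_e$ of a decision rule to the conditional Shannon entropy of the true class $Y$ given the classifier output $\hat{Y}$:

$$H(Y \mid \hat{Y}) \le h_b(P_e) + P_e \log(N - 1), \qquad h_b(t) = -t \log t - (1 - t) \log(1 - t),$$

which, after loosening $h_b(P_e) \le 1$ (base-2 logarithms), gives the familiar lower bound $P_e \ge \big( H(Y \mid \hat{Y}) - 1 \big) / \log(N - 1)$. As the abstract states, replacing the Shannon quantity with Renyi entropies of varying order $\alpha$ yields a family of lower bounds containing Fano's, while orders below one yield upper bounds that tighten as $\alpha \to 1^-$.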

Published in:

Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), Volume 1

Date of Conference:

2001