
Error estimation in pattern recognition via L_\alpha -distance between posterior density functions


The L_\alpha-distance between posterior density functions (PDF's) is proposed as a separability measure to replace the probability of error as a criterion for feature extraction in pattern recognition. Upper and lower bounds on the Bayes error are derived for \alpha > 0. If \alpha = 1, the lower and upper bounds coincide; increasing or decreasing \alpha loosens these bounds. For \alpha = 2, the upper bound equals the best commonly used bound and is equal to the asymptotic probability of error of the first nearest neighbor classifier. The case \alpha = 1 is used to estimate the probability of error in different problem situations, and a comparison is made with other methods. It is shown how unclassified samples may also be used to reduce the variance of the estimated error. For the family of exponential probability density functions (pdf's), the relation between the distance of a sample from the decision boundary and its contribution to the error is derived. In the nonparametric case, a consistent estimator is discussed that is computationally more efficient than estimators based on Parzen's estimation. A set of computer simulation experiments is reported to demonstrate the statistical advantages of the separability measure with \alpha = 1 when used in an error estimation scheme.
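The \alpha = 1 and \alpha = 2 claims above can be made concrete in the two-class case. The sketch below assumes the L_\alpha-distance is defined as the expected \alpha-th power of the difference of the posterior probabilities (a standard formulation; the paper's exact notation may differ):

J_\alpha = E_x\big[\,|P(\omega_1 \mid x) - P(\omega_2 \mid x)|^{\alpha}\,\big], \qquad \alpha > 0.

For \alpha = 1, using \min(a, b) = \tfrac{1}{2}(a + b - |a - b|) together with P(\omega_1 \mid x) + P(\omega_2 \mid x) = 1, the Bayes error satisfies the exact identity

P_e = \int \min\{P(\omega_1 \mid x),\, P(\omega_2 \mid x)\}\, p(x)\, dx = \tfrac{1}{2}\,(1 - J_1),

so the lower and upper bounds coincide. For \alpha = 2, since (P(\omega_1 \mid x) - P(\omega_2 \mid x))^2 = 1 - 4\,P(\omega_1 \mid x)\,P(\omega_2 \mid x), the quantity \tfrac{1}{2}(1 - J_2) = E_x[\,2\,P(\omega_1 \mid x)\,P(\omega_2 \mid x)\,] is the asymptotic error of the first nearest neighbor classifier, which upper-bounds P_e.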

Published in:

IEEE Transactions on Information Theory (Volume: 22, Issue: 1)