Bias of error rates in linear discriminant analysis caused by feature selection and sample size


Author: H. Schulerud, Department of Informatics, University of Oslo, Norway

The holdout and leave-one-out error estimates for a two-class problem with multivariate normal class distributions and a common covariance matrix are derived as functions of the number of feature candidates, the classifier dimensionality, the sample size, and the Mahalanobis distance, using Monte Carlo simulations. It is demonstrated that the leave-one-out error rate is a highly biased estimate of the true error if feature selection is performed on the same data before error estimation. This problem is especially pronounced when many features are evaluated on a small data set. In contrast, the holdout error is an almost unbiased estimate of the true error, independent of the number of feature candidates.
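The selection-induced bias described above can be reproduced with a small simulation. The sketch below is illustrative only: the sample size, number of feature candidates, ranking criterion (a simple t-like statistic), and ridge term are assumptions for the demo, not the paper's exact setup. With no informative features, the true error is 0.5, yet leave-one-out error computed after selecting features on the full data set typically comes out far lower.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not the paper's): n samples per class,
# d candidate features (all pure noise), select the k "best".
n, d, k = 10, 50, 2

X = rng.standard_normal((2 * n, d))
y = np.array([0] * n + [1] * n)

def lda_predict(Xtr, ytr, Xte):
    """Two-class LDA with a pooled (common) covariance estimate."""
    m0, m1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    Xc = np.vstack([Xtr[ytr == 0] - m0, Xtr[ytr == 1] - m1])
    # Small ridge term keeps the pooled covariance invertible.
    S = Xc.T @ Xc / (len(Xtr) - 2) + 1e-6 * np.eye(Xtr.shape[1])
    w = np.linalg.solve(S, m1 - m0)
    c = w @ (m0 + m1) / 2
    return (Xte @ w > c).astype(int)

# Biased protocol: rank features on the FULL data set, then estimate
# error by leave-one-out on the pre-selected features.
t = np.abs(X[y == 0].mean(0) - X[y == 1].mean(0)) / X.std(0)
sel = np.argsort(t)[-k:]

errs = []
for i in range(2 * n):
    mask = np.arange(2 * n) != i
    pred = lda_predict(X[mask][:, sel], y[mask], X[[i]][:, sel])
    errs.append(pred[0] != y[i])
loo_err = float(np.mean(errs))

# True error is 0.5 (no feature carries class information); loo_err
# is typically well below it, showing the optimistic selection bias.
print(loo_err)
```

Repeating the selection inside each leave-one-out fold, or estimating error on a held-out set, removes this optimism, which is the paper's point about the holdout estimate.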

Published in:

Proceedings of the 15th International Conference on Pattern Recognition, 2000 (Volume 2)

Date of Conference: