On dimensionality, sample size, and classification error of nonparametric linear classification algorithms

1 Author(s)
Raudys, S.; Inst. of Math. & Inf., Vilnius Univ., Lithuania

This paper compares two nonparametric linear classification algorithms - the zero empirical error classifier and the maximum margin classifier - with parametric linear classifiers designed to classify multivariate Gaussian populations. Formulae and a table for the mean expected probability of misclassification (MEP_N) are presented. They show that the classification error is mainly determined by N/p, the ratio of learning-set size to dimensionality. However, the influence of learning-set size on the generalization error differs markedly between parametric and nonparametric linear classifiers. Under certain conditions the nonparametric approach allows us to obtain reliable rules, even in cases where the number of features is larger than the number of training vectors.
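As a minimal illustrative sketch (not the paper's own experiments or algorithms), the small-sample effect described in the abstract can be reproduced numerically: a maximum-margin linear classifier (here approximated by scikit-learn's LinearSVC with a large C) is contrasted with a parametric Gaussian-based linear classifier (Fisher-type LDA) on two spherical Gaussian classes with dimensionality p larger than the number of training vectors. The dimensionality, sample sizes, mean shift, and regularization constant below are arbitrary choices for demonstration.

# Sketch: nonparametric (max-margin) vs. parametric (LDA) linear classifiers
# on Gaussian data with more features (p) than training vectors.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

p = 100        # dimensionality (assumed value, larger than training-set size)
N = 40         # training vectors per class
N_test = 2000  # test vectors per class
delta = 0.3    # per-coordinate mean shift between the two classes

def sample(n):
    # Draw n vectors per class from two spherical Gaussian populations.
    x0 = rng.normal(0.0, 1.0, size=(n, p))
    x1 = rng.normal(delta, 1.0, size=(n, p))
    X = np.vstack([x0, x1])
    y = np.r_[np.zeros(n), np.ones(n)]
    return X, y

X_train, y_train = sample(N)
X_test, y_test = sample(N_test)

# Maximum-margin linear classifier; a large C approximates a hard margin.
svm = LinearSVC(C=1e6, max_iter=20000).fit(X_train, y_train)

# Parametric linear classifier built on the Gaussian assumption (Fisher/LDA).
lda = LinearDiscriminantAnalysis().fit(X_train, y_train)

for name, clf in [("max-margin", svm), ("parametric LDA", lda)]:
    err = 1.0 - clf.score(X_test, y_test)
    print(f"{name:>15s}: test error = {err:.3f}")

Varying N and p in this sketch shows how the generalization error tracks the ratio N/p, and that the margin-based rule can still be trained when p exceeds the total number of training vectors, which is the regime the paper analyzes.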

Published in:

IEEE Transactions on Pattern Analysis and Machine Intelligence (Volume: 19, Issue: 6)