On feature selection in a class of distribution-free pattern classifiers

1 Author(s)

A feature-selection procedure is proposed for the class of distribution-free pattern classifiers [1], [2]. The procedure can be carried out readily on fixed (large) training samples using matrix inversion; if direct matrix inversion is to be avoided, the approximation method [4] or the stochastic-approximation procedure [2] can be applied to the training samples instead. The proposed procedure admits a mapping interpretation in addition to a statistical one. It has the unique property of designing the pattern classifier under a single performance criterion, rather than the conventional division into receptor and categorizer, and it enables the system to come closest to the minimum-risk ideal classifier. In particular, for two-class problems with normal distributions having equal covariance matrices, equal costs of misrecognition, and equal a priori probabilities, the procedure yields the optimum Bayes classifier without knowledge of the class distributions. Furthermore, the proposed feature selection is the same as that obtained from the divergence computation. Experimental results are presented and are considered satisfactory.
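The two-class special case cited in the abstract has a well-known closed form: when both classes are normal with a common covariance matrix, equal misclassification costs, and equal priors, the minimum-risk (Bayes) classifier is linear, and estimating it from training samples requires only one matrix inversion. The sketch below illustrates that special case only, not the paper's general procedure; all function and variable names are illustrative.

```python
import numpy as np

def fit_linear_bayes(X0, X1):
    """Estimate the linear Bayes rule from samples of two equal-covariance
    normal classes (equal priors, equal costs assumed)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled (shared) covariance estimate over both classes.
    S = (np.cov(X0, rowvar=False) * (len(X0) - 1)
         + np.cov(X1, rowvar=False) * (len(X1) - 1)) / (len(X0) + len(X1) - 2)
    w = np.linalg.solve(S, m1 - m0)   # direction S^{-1}(m1 - m0): one matrix inversion
    b = -0.5 * w @ (m0 + m1)          # threshold at the midpoint of the two means
    return w, b

def classify(x, w, b):
    # Decide class 1 on the positive side of the hyperplane, class 0 otherwise.
    return int(w @ x + b > 0)

# Synthetic illustration: two Gaussian classes sharing one covariance matrix.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.3], [0.3, 1.0]])
X0 = rng.multivariate_normal([0.0, 0.0], cov, size=500)
X1 = rng.multivariate_normal([3.0, 3.0], cov, size=500)
w, b = fit_linear_bayes(X0, X1)
acc = (sum(classify(x, w, b) == 0 for x in X0)
       + sum(classify(x, w, b) == 1 for x in X1)) / 1000.0
```

Note that no density is ever modeled explicitly: the rule is built from sample means and a pooled covariance alone, which is the sense in which the optimum Bayes procedure is obtained "without knowledge of the class distributions."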

Published in:

IEEE Transactions on Information Theory (Volume: 16, Issue: 1)