Feature selection via set cover


Author: M. Dash; Dept. of Inf. Syst. & Comput. Sci., Nat. Univ. of Singapore, Singapore

In pattern classification, features are used to define classes. Feature selection is a preprocessing step that searches for an “optimal” subset of features. Class separability is normally used as the basic feature selection criterion. Instead of maximizing class separability, as in the literature, this work adopts a criterion aimed at maintaining the discriminating power of the data in describing its classes. In other words, the problem is formalized as finding the smallest set of features that is “consistent” in describing classes. We describe a multivariate measure of feature consistency. The new feature selection algorithm is based on Johnson's (1974) algorithm for set covering. Johnson's analysis implies that this algorithm runs in polynomial time and outputs a consistent feature set whose size is within a log factor of the best possible. Our experiments show that its performance in practice is much better than this, and that it outperforms earlier methods using a similar amount of time.
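The reduction in the abstract can be illustrated with a short sketch (not the paper's implementation): treat each pair of samples with different class labels as an element to be covered, and say a feature "covers" a pair when the two samples differ on that feature. Johnson's greedy rule then repeatedly picks the feature that distinguishes the most still-uncovered pairs. The function name and data layout below are assumptions made for illustration.

```python
from itertools import combinations

def greedy_consistent_features(X, y):
    """Greedy set-cover sketch of consistency-based feature selection.

    Elements: pairs of samples whose class labels differ.
    A feature covers a pair if the two samples differ on that feature.
    Returns indices of a feature subset that keeps the data consistent.
    """
    n_features = len(X[0])
    # All between-class sample pairs must remain distinguishable.
    pairs = [(i, j) for i, j in combinations(range(len(X)), 2)
             if y[i] != y[j]]
    # Precompute which pairs each feature distinguishes.
    covers = {f: {p for p in pairs if X[p[0]][f] != X[p[1]][f]}
              for f in range(n_features)}
    uncovered = set(pairs)
    selected = []
    while uncovered:
        # Johnson's greedy choice: maximize newly covered pairs.
        best = max(range(n_features),
                   key=lambda f: len(covers[f] & uncovered))
        gained = covers[best] & uncovered
        if not gained:
            break  # the data itself is inconsistent; stop
        selected.append(best)
        uncovered -= gained
    return selected
```

On a toy dataset such as `X = [[0, 0], [0, 1], [1, 0], [1, 1]]` with labels `y = [0, 0, 1, 1]`, the first feature alone distinguishes every between-class pair, so the greedy rule selects just that one feature. Johnson's analysis gives the logarithmic approximation guarantee the abstract cites for exactly this greedy rule.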

Published in:

Knowledge and Data Engineering Exchange Workshop, 1997. Proceedings

Date of Conference:

4 Nov 1997