Mutual information and intrinsic dimensionality for feature selection

Authors: Gómez, W.; Leija, L.; Díaz-Pérez, A. (Inf. Technol. Lab., CINVESTAV-IPN, Victoria, Mexico)

In this article we propose a feature selection method based on mutual information (MI) and intrinsic dimensionality (ID) estimators. First, MI ranks the normalized feature space according to the minimal-redundancy-maximal-relevance (mRMR) criterion. Next, ID estimation determines the minimum number of features needed to represent the observed properties of the data. Two ID techniques were tested: principal component analysis (PCA) and the maximum likelihood estimator (MLE). A support vector machine (SVM) was used to classify five medical datasets, and receiver operating characteristic (ROC) analysis evaluated classification performance before and after feature selection. The results show that MI and ID are effective feature selection techniques for reducing classification error.
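The pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses synthetic data in place of the five medical datasets, a simple histogram MI estimate for feature-feature redundancy, the PCA variant of ID estimation (with an assumed 95% explained-variance cutoff, a threshold the paper does not specify), and scikit-learn for the SVM and ROC analysis.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for one of the paper's medical datasets.
X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           n_redundant=10, random_state=0)
X = StandardScaler().fit_transform(X)  # normalize the feature space

def pairwise_mi(a, b, bins=10):
    """Histogram-based MI estimate between two continuous features."""
    p_xy = np.histogram2d(a, b, bins)[0]
    p_xy /= p_xy.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

def mrmr_rank(X, y):
    """Greedy mRMR: maximize MI(f; y) minus mean MI with selected features."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    while len(selected) < X.shape[1]:
        rest = [j for j in range(X.shape[1]) if j not in selected]
        scores = [relevance[j]
                  - np.mean([pairwise_mi(X[:, j], X[:, s]) for s in selected])
                  for j in rest]
        selected.append(rest[int(np.argmax(scores))])
    return selected

ranking = mrmr_rank(X, y)

# ID via PCA: smallest number of components explaining 95% of the variance.
cum_var = np.cumsum(PCA().fit(X).explained_variance_ratio_)
intrinsic_dim = int(np.searchsorted(cum_var, 0.95) + 1)

# Keep only the top-ID ranked features and compare SVMs via ROC AUC.
keep = ranking[:intrinsic_dim]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
auc_full = roc_auc_score(y_te, SVC(probability=True, random_state=0)
                         .fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
auc_sel = roc_auc_score(y_te, SVC(probability=True, random_state=0)
                        .fit(X_tr[:, keep], y_tr)
                        .predict_proba(X_te[:, keep])[:, 1])
print(f"ID estimate: {intrinsic_dim}")
print(f"AUC, all features: {auc_full:.3f}; AUC, selected: {auc_sel:.3f}")
```

For the MLE alternative, the paper's ID estimate would come from a nearest-neighbor maximum likelihood estimator rather than the PCA variance cutoff; the rest of the pipeline is unchanged.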

Published in:

2010 7th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)

Date of Conference:

8-10 Sept. 2010