AMIFS: adaptive feature selection by using mutual information

Authors: M. Tesmer and P. A. Estevez, Department of Electrical Engineering, University of Chile, Santiago, Chile

An adaptive feature selection method based on mutual information, called AMIFS, is presented. AMIFS is an enhancement over Battiti's MIFS and the MIFS-U method. In AMIFS, the tradeoff between eliminating irrelevance and redundancy is controlled adaptively, instead of by a fixed parameter. The mutual information is computed from discrete probabilities in the case of discrete features, or by using an extended version of Fraser's algorithm in the case of continuous features. The performance of AMIFS is compared with that of MIFS and MIFS-U on artificial and benchmark datasets. The simulation results show that AMIFS outperforms both MIFS and MIFS-U, especially for high-dimensional data with many irrelevant and/or redundant features.
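For context, the sketch below illustrates the greedy mutual-information-based selection scheme of Battiti's MIFS, which AMIFS builds on: at each step the candidate feature f maximising I(f; C) - beta * sum over already-selected s of I(f; s) is added. The fixed redundancy weight beta shown here is exactly what AMIFS replaces with an adaptive term; the function names, the default beta, and the simple histogram estimator for discrete features are illustrative assumptions, not code from the paper (continuous features would instead need a Fraser-style estimator, as the abstract notes).

import numpy as np

def mutual_information(x, y):
    """Mutual information I(X;Y), in nats, for 1-D arrays of
    non-negative integer codes (discretised features)."""
    joint = np.zeros((int(x.max()) + 1, int(y.max()) + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1.0
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)   # marginal P(Y)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def mifs_select(X, y, k, beta=0.5):
    """Greedy MIFS-style selection (Battiti): at each step add the
    feature f maximising I(f; y) - beta * sum_{s in S} I(f; X[:, s]).
    AMIFS replaces the fixed beta with an adaptive weight."""
    n_features = X.shape[1]
    relevance = [mutual_information(X[:, j], y) for j in range(n_features)]
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        scores = [relevance[j]
                  - beta * sum(mutual_information(X[:, j], X[:, s]) for s in selected)
                  for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: 200 samples, 6 integer-coded features, pick the 3 best.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 6))
y = (X[:, 0] + X[:, 1]) % 2            # class depends on features 0 and 1
print(mifs_select(X, y, k=3))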

Published in:

Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Volume 1

Date of Conference:

25-29 July 2004