An adaptive feature selection method based on mutual information, called AMIFS, is presented. AMIFS is an enhancement of Battiti's MIFS and the MIFS-U method. In AMIFS the tradeoff between eliminating irrelevance and eliminating redundancy is controlled adaptively, instead of by a fixed parameter. Mutual information is computed from discrete probabilities in the case of discrete features, or by an extended version of Fraser's algorithm in the case of continuous features. The performance of AMIFS is compared with that of MIFS and MIFS-U on artificial and benchmark datasets. The simulation results show that AMIFS outperforms both MIFS and MIFS-U, especially on high-dimensional data with many irrelevant and/or redundant features.
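To make the selection criterion concrete, the following is a minimal sketch of Battiti's original MIFS greedy rule for discrete features, which AMIFS builds on: each step picks the candidate feature maximizing its mutual information with the class minus a beta-weighted sum of its mutual information with the already-selected features. This is the fixed-parameter baseline, not AMIFS itself; AMIFS's adaptive redundancy normalization and the extended Fraser algorithm for continuous features are not reproduced here, and the function names and the `beta` default are illustrative assumptions.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """I(X;Y) in nats for two discrete sequences, from empirical probabilities."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log(p_joint / ((px[a] / n) * (py[b] / n)))
    return mi

def mifs_select(features, target, k, beta=0.5):
    """Greedy MIFS: at each step maximize I(f;C) - beta * sum_s I(f;s).

    features: list of discrete feature columns; target: class labels.
    Returns the indices of the k selected features, in selection order.
    """
    remaining = list(range(len(features)))
    selected = []
    while remaining and len(selected) < k:
        def score(i):
            relevance = mutual_information(features[i], target)
            redundancy = sum(mutual_information(features[i], features[j])
                             for j in selected)
            return relevance - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With a fixed `beta`, an exact copy of an already-selected relevant feature can still outscore a weakly relevant one, which illustrates the redundancy/irrelevance tradeoff that AMIFS controls adaptively instead.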