Abstract:
Feature selection has been an active research area over the past decades. The objectives of feature selection include improving prediction accuracy, accelerating classification, and gaining a better understanding of the features. Feature selection methods are often divided into three categories: filter methods, wrapper methods, and embedded methods. In this paper, we propose a simple leave-one-feature-out wrapper method for feature selection. The main goal is to improve prediction accuracy. A distinctive feature of our method is that the number of cross-validation trainings is a user-controlled constant multiple of the number of features. The strategy can be applied to any classifier, and the idea is intuitive. Given the wide availability of off-the-shelf machine learning software packages and computing power, the proposed simple method may be particularly attractive to practitioners. Numerical experiments are included to show the simple usage and the effectiveness of the method.
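The abstract does not spell out the paper's exact algorithm, so the following is only a minimal illustrative sketch of a generic leave-one-feature-out wrapper, assuming scikit-learn and a simple selection rule (drop any feature whose removal does not reduce cross-validated accuracy). The function name `lofo_select` and the choice of classifier and dataset are our own for illustration. Note that the total work is (n_features + 1) × cv trainings, consistent with the abstract's claim that the number of cross-validation trainings is a constant multiple of the number of features.

```python
# Illustrative leave-one-feature-out (LOFO) wrapper sketch using scikit-learn.
# Selection rule (an assumption, not taken from the paper): keep a feature
# only if removing it lowers mean cross-validated accuracy.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def lofo_select(estimator, X, y, cv=5):
    """Return indices of features to keep.

    Runs cv folds once with all features, then cv folds with each single
    feature left out: (n_features + 1) * cv trainings in total.
    """
    baseline = cross_val_score(estimator, X, y, cv=cv).mean()
    keep = []
    for j in range(X.shape[1]):
        # Score the model with feature j removed.
        X_minus_j = np.delete(X, j, axis=1)
        score = cross_val_score(estimator, X_minus_j, y, cv=cv).mean()
        if score < baseline:
            # Removing feature j hurts accuracy, so keep it.
            keep.append(j)
    return keep


X, y = load_breast_cancer(return_X_y=True)
clf = LogisticRegression(max_iter=5000)
selected = lofo_select(clf, X, y)
print(f"kept {len(selected)} of {X.shape[1]} features: {selected}")
```

Because the wrapper treats the estimator as a black box, the same loop works with any classifier exposing the scikit-learn fit/predict interface, which matches the abstract's point that the strategy applies to any classifier.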
Notes: Please be advised that the paper you have accessed is a draft of the final paper that was presented at the conference. This draft will be replaced with the final paper shortly.
Date of Conference: 16-18 December 2013
Date Added to IEEE Xplore: 24 February 2014