Exhaustive feature selection algorithms are optimal because every possible combination of features is tested against a predetermined criterion. Because the number of combinations grows exponentially with the number of features, however, suboptimal algorithms that trade performance for speed by considering only a subset of all feature combinations are generally preferred. An implementation of the exhaustive search feature selection (ESFS) method is described for Bayes Gaussian statistics. The algorithm significantly reduces the computational and time requirements normally associated with optimal algorithms. Its performance is compared to that of two suboptimal algorithms: forward sequential feature selection and stepwise linear discriminant analysis. Results show that this implementation provides a moderate improvement in classification accuracy and is well suited for evaluating the performance of suboptimal algorithms.
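To illustrate the exhaustive strategy the abstract contrasts with suboptimal search, the sketch below enumerates every non-empty feature subset and keeps the one that maximizes a selection criterion. The feature names, per-feature scores, and size penalty are hypothetical stand-ins for a real classification criterion; this is a minimal sketch of generic exhaustive search, not the paper's ESFS implementation.

```python
from itertools import combinations


def exhaustive_feature_selection(features, criterion):
    """Evaluate every non-empty feature subset and return the best one.

    With n features there are 2**n - 1 non-empty subsets, which is why
    exhaustive search becomes prohibitive as n grows.
    """
    best_subset, best_score = None, float("-inf")
    for k in range(1, len(features) + 1):
        for subset in combinations(features, k):
            score = criterion(subset)
            if score > best_score:
                best_subset, best_score = subset, score
    return best_subset, best_score


# Hypothetical criterion: made-up per-feature "usefulness" scores with a
# penalty per selected feature, standing in for a real classification metric.
usefulness = {"f1": 0.9, "f2": 0.4, "f3": 0.7, "f4": 0.1}
criterion = lambda s: sum(usefulness[f] for f in s) - 0.3 * len(s)

best, score = exhaustive_feature_selection(list(usefulness), criterion)
```

Suboptimal methods such as forward sequential selection avoid this exponential enumeration by growing the subset one greedily chosen feature at a time, at the cost of possibly missing the globally best combination.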