Feature selection has been an important problem in recent decades: identifying the features most relevant to a given classification task. Numerous methods have emerged that incorporate support vector machines (SVMs) into the selection process. Such approaches are powerful but often complex and costly. In this paper, we propose new feature selection methods based on two criteria designed for SVM optimization: kernel target alignment and kernel class separability. We demonstrate how these two measures, when fully expressed, yield efficient and simple methods that are easily applicable to multiclass problems and iteratively computable with minimal memory requirements. An extensive experimental study is conducted on both artificial and real-world datasets to compare the proposed methods with state-of-the-art feature selection algorithms. The results demonstrate the relevance of the proposed methods in terms of both performance and computational cost.
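To make the first criterion concrete, the following is a minimal sketch (not the paper's method) of how kernel target alignment can score individual features: each feature is scored by the cosine similarity between a kernel built from that feature alone and the ideal target kernel yyᵀ. The RBF kernel, the `gamma` value, and the function names are illustrative assumptions.

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Empirical kernel-target alignment: cosine similarity (in the
    Frobenius inner product) between K and the ideal kernel y y^T."""
    Y = np.outer(y, y)  # +1 for same-class pairs, -1 otherwise
    num = np.sum(K * Y)                           # <K, yy^T>_F
    den = np.linalg.norm(K) * np.linalg.norm(Y)   # ||K||_F ||yy^T||_F
    return num / den

def rank_features(X, y, gamma=1.0):
    """Rank features by the alignment of a single-feature RBF kernel
    (an illustrative choice of kernel, not prescribed by the paper)."""
    scores = []
    for j in range(X.shape[1]):
        x = X[:, [j]]                    # column j, shape (n, 1)
        K = np.exp(-gamma * (x - x.T) ** 2)  # RBF kernel on one feature
        scores.append(kernel_target_alignment(K, y))
    return np.argsort(scores)[::-1]      # best-aligned features first
```

Because the score decomposes into sums over sample pairs, it can be accumulated incrementally, which is consistent with the abstract's claim of iterative computation with minimal memory requirements.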