Support Vector Machines (SVMs) are popular for pattern classification. However, training an SVM requires large amounts of memory and long processing times, especially on large datasets, which limits its applicability. To speed up training, we present a new, efficient support vector selection method based on ensemble margin, a key concept in ensemble classifiers. The algorithm exploits a new version of the margin of ensemble-based classification and selects the smallest-margin instances as support vectors. Our experimental results show that the method significantly reduces training set size without degrading the performance of the resulting SVM classifiers.
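The selection idea can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it uses a random forest as the ensemble and a standard supervised vote-based margin (the votes for an instance's true class minus the highest vote count for any other class, normalized by ensemble size), whereas the paper introduces its own margin variant. All function names, dataset parameters, and the 20% selection fraction are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

def ensemble_margins(forest, X, y):
    """Per-instance ensemble margin: (votes for the true class minus
    the max votes for any other class) / number of trees, in [-1, 1]."""
    votes = np.array([tree.predict(X) for tree in forest.estimators_])
    n_trees = len(forest.estimators_)
    classes = forest.classes_
    # Count, per class, how many trees voted for it on each instance.
    counts = np.array([(votes == c).sum(axis=0) for c in classes])
    idx = np.arange(len(y))
    true_idx = np.searchsorted(classes, y)
    true_votes = counts[true_idx, idx]
    # Mask the true class so the max picks the strongest *other* class.
    masked = counts.copy()
    masked[true_idx, idx] = -1
    other_votes = masked.max(axis=0)
    return (true_votes - other_votes) / n_trees

# Synthetic data standing in for a large training set.
X, y = make_classification(n_samples=500, random_state=0)
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
margins = ensemble_margins(forest, X, y)

# Keep the smallest-margin instances (hardest, boundary-like points)
# as the reduced SVM training set; 20% is an arbitrary choice here.
keep = np.argsort(margins)[: len(y) // 5]
svm_reduced = SVC().fit(X[keep], y[keep])
```

Low-margin instances are those the ensemble nearly misclassifies, so they tend to lie close to the decision boundary, which is exactly where support vectors come from; training the SVM only on them shrinks the quadratic-programming problem.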