Linear Support Vector Machines (SVMs) are widely applied to supervised classification problems with high-dimensional feature spaces, such as text classification, word sense disambiguation, and email spam detection. Given the large volume of available training data, efficient training algorithms for linear SVMs have drawn much attention from the research community in recent years. The cutting-plane method is one of the state-of-the-art training algorithms for linear SVMs. Within this framework, a quadratic programming (QP) subproblem, which consists of bound constraints and a single inequality constraint, must be solved at each iteration; this step is one of the most time-consuming tasks in the whole method. In current software, these QP subproblems are usually solved by an interior point method. To improve the efficiency of cutting-plane based training, we transform the inequality constraint into an equality by introducing a slack variable and propose a projected gradient algorithm for the transformed QP subproblem. Compared with the existing method, the new algorithm has two advantages. First, by carefully exploiting the special structure of the subproblem, the efficiency of solving it can be improved significantly. Second, because the variables are projected onto the bound constraints explicitly, the variables unrelated to support vectors can be identified directly; the rounding techniques required by the widely used interior-point solvers are therefore no longer needed. Experimental results on several public data sets also demonstrate the effectiveness and efficiency of the new algorithm.
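To illustrate the idea, the following is a minimal sketch (not the paper's actual solver) of projected gradient descent on a QP of the kind that arises here. It assumes the slack variable has already been absorbed, so the feasible set is the probability simplex {x : x ≥ 0, Σx = 1}; the objective min (1/2)xᵀQx + cᵀx, the function names, and the fixed step size 1/‖Q‖₂ are all illustrative choices, not taken from the paper. Note how the explicit projection zeroes out components directly, which mirrors the identification of non-support-vector variables described above.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = 1}.

    Uses the standard sort-and-threshold scheme: find the largest rho
    such that the shifted top-(rho+1) entries remain positive.
    """
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    # Components pushed below zero are clipped exactly to the bound.
    return np.maximum(v - theta, 0.0)

def projected_gradient_qp(Q, c, max_iter=500, tol=1e-8):
    """Minimize (1/2) x^T Q x + c^T x over the simplex (Q assumed PSD).

    Illustrative projected gradient with a fixed step 1 / ||Q||_2,
    a valid Lipschitz step for the gradient Q x + c.
    """
    n = len(c)
    x = np.full(n, 1.0 / n)          # feasible starting point
    step = 1.0 / np.linalg.norm(Q, 2)
    for _ in range(max_iter):
        g = Q @ x + c                # gradient of the quadratic
        x_new = project_simplex(x - step * g)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x
```

For example, with Q the identity and c = 0 the minimizer over the simplex is the uniform point, and the iterate that lands exactly on a vertex has its remaining coordinates set to zero by the projection rather than left at tiny positive values, so no rounding step is needed.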