Using Projection Gradient Method to Train Linear Support Vector Machines

Authors: Lingfeng Niu; Yong Shi
CAS Research Center on Fictitious Economy & Data Science, Graduate University of Chinese Academy of Sciences, Beijing, China

Linear Support Vector Machines (SVMs) have broad application in supervised classification problems with high-dimensional feature spaces, such as text classification, word sense disambiguation, and email spam detection. Given the large volume of available training data, efficient training algorithms for linear SVMs have drawn much attention from the research community in recent years. Cutting-plane-based methods are among the state-of-the-art training algorithms for linear SVMs. Within the cutting-plane framework, a quadratic programming (QP) subproblem, which consists of bound constraints and a single inequality constraint, must be solved at each iteration. This step is one of the most time-consuming tasks in the whole method. In current software, the QP subproblems are usually solved by an interior point method. To improve the efficiency of the cutting-plane-based training algorithm, we transform the inequality constraint into an equation by introducing a slack variable and propose a projection gradient algorithm to solve the transformed QP subproblem. Compared with the existing method, the new algorithm has the following advantages. First, because the special structure of the subproblem is exploited carefully, the efficiency of solving the subproblem can be improved significantly. Second, by projecting the variables onto the bound constraints explicitly, the variables that are not related to support vectors can be identified directly; therefore, the rounding technique, which is a necessary step in the widely used interior-point-based solvers, is no longer required. Experimental results on several public data sets also show the effectiveness and efficiency of our new algorithm.
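
The abstract does not state the subproblem explicitly, so the following is only an illustrative sketch, assuming the cutting-plane dual QP has the common form min 0.5*a'Ha - b'a subject to a >= 0 and sum(a) <= C. Introducing a slack variable s turns the inequality into sum(a) + s = C, and each projection gradient step then projects the extended variable (a, s) onto the resulting scaled simplex; coordinates driven to zero by the projection correspond to the "not related to support vectors" variables mentioned above. The names H, b, C and the simplex-projection routine are assumptions for illustration, not the authors' code.

```python
# Hedged sketch (not the authors' implementation): projected gradient descent
# for a box/simplex-structured QP of the kind described in the abstract.
import numpy as np

def project_simplex(v, C):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = C}."""
    u = np.sort(v)[::-1]                       # sort descending
    css = np.cumsum(u) - C
    idx = np.arange(1, len(v) + 1)
    rho = idx[u - css / idx > 0][-1]           # largest index with positive residual
    theta = css[rho - 1] / rho
    return np.maximum(v - theta, 0.0)

def projected_gradient_qp(H, b, C, steps=500, tol=1e-8):
    """Minimize 0.5*a'Ha - b'a over {a >= 0, sum(a) + s = C, s >= 0}."""
    n = len(b)
    x = np.full(n + 1, C / (n + 1))            # extended variable (a, s), feasible start
    L = np.linalg.norm(H, 2) + 1e-12           # Lipschitz constant of the gradient
    for _ in range(steps):
        a = x[:n]
        grad = np.concatenate([H @ a - b, [0.0]])   # slack has zero gradient
        x_new = project_simplex(x - grad / L, C)    # gradient step, then projection
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x[:n]    # zero entries are identified directly, no rounding step needed

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))
    H = A @ A.T                                # small PSD matrix standing in for the Gram matrix
    b = rng.standard_normal(5)
    print(projected_gradient_qp(H, b, C=1.0))
```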

Published in:

2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT), Volume 3

Date of Conference:

Aug. 31 - Sept. 3, 2010
