A novel smoothing 1-norm SVM for classification and regression

2 Author(s)
Yalu Liu; Ruopeng Wang (Dept. of Math. & Phys., Beijing Inst. of Petrochem. Technol., Beijing, China)

The standard 2-norm support vector machine (SVM) is known for its good performance on classification and regression problems. In this paper, the 1-norm support vector machine is considered, and a novel smoothing function method for support vector classification (SVC) and support vector regression (SVR) is proposed to overcome drawbacks of earlier methods, which are complex, subtle, and sometimes difficult to implement. First, using the Karush-Kuhn-Tucker complementarity condition from optimization theory, an unconstrained non-differentiable optimization model is built. Then a smooth approximation algorithm based on a differentiable function is given. Finally, the data sets are trained with a standard unconstrained optimization method. The algorithm is fast and insensitive to the initial point. Theoretical analysis and numerical results illustrate that the smoothing function method for SVMs is feasible and effective.
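The abstract's smoothing idea can be illustrated with the widely used smooth approximation of the plus function, max(0, x) ≈ (1/α)·log(1 + exp(αx)); the paper's specific smoothing function may differ. The sketch below, a hypothetical illustration rather than the authors' method, replaces both non-smooth terms of a 1-norm SVC (the 1-norm of the weights and the hinge loss) with this surrogate and trains it by a standard unconstrained quasi-Newton method:

```python
import numpy as np
from scipy.optimize import minimize

def smooth_plus(x, alpha=10.0):
    # Smooth surrogate for max(0, x): (1/alpha)*log(1 + exp(alpha*x)).
    # np.logaddexp keeps it numerically stable; converges as alpha -> inf.
    return np.logaddexp(0.0, alpha * x) / alpha

def smooth_abs(x, alpha=10.0):
    # |x| = max(0, x) + max(0, -x), so it inherits the same smoothing.
    return smooth_plus(x, alpha) + smooth_plus(-x, alpha)

def objective(wb, X, y, C=1.0, alpha=10.0):
    # Smoothed 1-norm SVC: min ||w||_1 + C * sum hinge(1 - y*(Xw + b)),
    # with both non-differentiable terms replaced by smooth surrogates,
    # giving an unconstrained differentiable problem.
    w, b = wb[:-1], wb[-1]
    margins = 1.0 - y * (X @ w + b)
    return smooth_abs(w, alpha).sum() + C * smooth_plus(margins, alpha).sum()

# Synthetic two-class data (50 points per class).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])

# Standard unconstrained optimization (BFGS) from a zero initial point.
res = minimize(objective, np.zeros(3), args=(X, y), method="BFGS")
w, b = res.x[:-1], res.x[-1]
acc = np.mean(np.sign(X @ w + b) == y)
```

Because the smoothed objective is differentiable everywhere, any off-the-shelf gradient-based solver applies; the smoothing parameter `alpha` trades approximation accuracy against conditioning.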

Published in:

2010 5th International Conference on Computer Science and Education (ICCSE)

Date of Conference:

24-27 Aug. 2010