
A regression approach to LS-SVM and sparse realization based on fast subset selection

4 Author(s)
Jingjing Zhang; Kang Li; G. W. Irwin; Wanqing Zhao (Sch. of Electron., Electr. Eng. & Comput. Sci., Queen's Univ. Belfast, Belfast, UK)

The Least Squares Support Vector Machine (LS-SVM) is a modified SVM with a ridge regression cost function and equality constraints. It has been successfully applied to many classification problems, but a common issue with the LS-SVM is its lack of sparseness, which is a serious drawback in applications. To tackle this problem, a fast approach for developing a sparse LS-SVM is proposed in this paper. First, a new regression solution is proposed for the LS-SVM which optimizes the same objective function as the conventional solution. Based on this, a new subset selection method is then adopted to realize the sparse approximation. Simulation results on different benchmark datasets, i.e. Checkerboard and two Gaussian datasets, show that the proposed solution achieves a better objective value than the conventional LS-SVM, and that the proposed approach yields a sparser LS-SVM than the conventional one while providing comparable predictive classification accuracy. Additionally, the computational complexity is significantly decreased.
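For context, the conventional LS-SVM classifier that the paper takes as its baseline reduces training to a single linear (KKT) system rather than a quadratic program. Below is a minimal Python/NumPy sketch of that standard formulation; the RBF kernel choice, the hyperparameters gamma and sigma, and all function names are illustrative assumptions, and this is the conventional solution, not the authors' regression-based one or their subset-selection method.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X1 and X2
    # (illustrative kernel choice, not specified by the paper).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LS-SVM KKT system for a binary classifier:

        [ 0    y^T              ] [ b     ]   [ 0 ]
        [ y    Omega + I/gamma  ] [ alpha ] = [ 1 ]

    with Omega_ij = y_i * y_j * K(x_i, x_j) and labels y_i in {-1, +1}.
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, multipliers alpha

def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
    # Decision function: f(x) = sign( sum_i alpha_i y_i K(x, x_i) + b )
    K = rbf_kernel(Xnew, X, sigma)
    return np.sign(K @ (alpha * y) + b)
```

Because the equality constraints make the solution of this system generally dense (almost every alpha_i is nonzero), every training point acts as a support vector. That is the lack of sparseness described above, which the paper's fast subset-selection approach is designed to remedy.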

Published in:

2012 10th World Congress on Intelligent Control and Automation (WCICA)

Date of Conference:

6-8 July 2012