Semisupervised Least Squares Support Vector Machine

Mathias M. Adankon; École de Technologie Supérieure, University of Quebec, Montreal, QC, Canada; Mohamed Cheriet; Alain Biem

The least squares support vector machine (LS-SVM), like the SVM, is based on margin maximization, performing structural risk minimization, and has excellent generalization power. In this paper, we consider its use in semisupervised learning. We propose two algorithms to perform this task, derived from the transductive SVM idea. Algorithm 1 is based on a combinatorial search guided by certain heuristics, while Algorithm 2 iteratively builds the decision function by adding one unlabeled sample at a time. In terms of complexity, Algorithm 1 is faster, but Algorithm 2 yields a classifier with better generalization capacity when only a few labeled data are available. Our proposed algorithms are tested on several benchmarks and give encouraging results that confirm our approach.
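The abstract only outlines the two algorithms, so the following is a minimal Python/NumPy sketch of the general idea behind Algorithm 2: train an LS-SVM on the labeled data, then repeatedly pseudo-label the unlabeled sample the current classifier is most confident about and retrain with it. The RBF kernel, the hyperparameters gamma and sigma, and the |f(x)| confidence criterion are illustrative assumptions, not the authors' exact heuristics.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVM dual: solve the linear system
    #   [0    1^T          ] [b    ]   [0]
    #   [1    K + I/gamma  ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # Decision function f(z) = sum_i alpha_i K(z, x_i) + b
    return lambda Z: rbf_kernel(Z, X, sigma) @ alpha + b

def semisupervised_ls_svm(X_lab, y_lab, X_unl, gamma=10.0, sigma=1.0):
    # Self-training-style loop: add one unlabeled sample at a time,
    # choosing the one the current classifier scores most confidently
    # (largest |f(x)|, an assumption made here for illustration).
    X, y = X_lab.copy(), y_lab.copy()
    pool = list(range(len(X_unl)))
    f = train_ls_svm(X, y, gamma, sigma)
    while pool:
        scores = f(X_unl[pool])
        best = int(np.argmax(np.abs(scores)))
        idx = pool.pop(best)
        X = np.vstack([X, X_unl[idx:idx + 1]])
        y = np.append(y, 1.0 if scores[best] >= 0 else -1.0)  # pseudo-label
        f = train_ls_svm(X, y, gamma, sigma)                   # retrain
    return f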

Published in:

IEEE Transactions on Neural Networks (Volume: 20, Issue: 12)