Kernel second-order discriminants versus support vector machines

Authors (3):

F. Abdallah (Lab. LM2S, Univ. de Tech. de Troyes, France); C. Richard; R. Lengelle

Support vector machines (SVMs) are among the best-known nonlinear classifiers based on the Mercer kernel trick. They generally lead to very sparse solutions that ensure good generalization performance. Recently, S. Mika et al. (see Advances in Neural Networks for Signal Processing, p.41-8, 1999) proposed a new nonlinear technique based on the kernel trick and the Fisher criterion: the nonlinear kernel Fisher discriminant (KFD). Experiments show that KFD is competitive with SVM classifiers. Nevertheless, it can be shown that there exist distributions such that, even though the two classes are linearly separable, the Fisher linear discriminant has an error probability close to 1. We propose an alternative strategy based on Mercer kernels that consists of selecting the optimal nonlinear receiver in the sense of the best second-order criterion. We also present a strategy for controlling the complexity of the resulting classifier. Finally, we compare this new method with SVM and KFD.

Published in:

2003 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '03), Volume 6

Date of Conference:

6-10 April 2003