Minimum Class Variance Support Vector Machines

3 Author(s)
S. Zafeiriou, A. Tefas, and I. Pitas; Aristotle University of Thessaloniki, Thessaloniki, Greece

In this paper, a modified class of support vector machines (SVMs) inspired by the optimization of Fisher's discriminant ratio is presented, the so-called minimum class variance SVMs (MCVSVMs). The MCVSVM optimization problem is solved in cases in which the training set contains fewer samples than the dimensionality of the training vectors by applying dimensionality reduction through principal component analysis (PCA). Afterward, the MCVSVMs are extended to find nonlinear decision surfaces by solving the optimization problem in arbitrary Hilbert spaces defined by Mercer's kernels. In that case, it is shown that, under kernel PCA, the nonlinear optimization problem is transformed into an equivalent linear MCVSVM problem. The effectiveness of the proposed approach is demonstrated by comparing it with standard SVMs and other classifiers, such as kernel Fisher discriminant analysis, on facial image characterization problems such as gender determination, eyeglass detection, and neutral facial expression detection.
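The core modification is simple to state: where a standard SVM penalizes the margin term ||w||^2, the MCVSVM penalizes w' S_w w, with S_w the within-class scatter matrix, so whitening the data by S_w^(-1/2) turns the problem into an ordinary linear SVM on the transformed inputs. The following is a minimal illustrative sketch of that reduction, not the authors' implementation: it assumes scikit-learn's SVC and PCA are available, handles the small-sample case with PCA as the abstract describes, and the function names and the eps regularizer are hypothetical choices for this sketch.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

def fit_mcvsvm(X, y, C=1.0, eps=1e-6):
    """Sketch of a linear MCVSVM: whiten by the within-class scatter
    S_w, then train a standard soft-margin SVM in the whitened space."""
    # When there are fewer samples than dimensions, S_w is singular,
    # so first reduce the dimensionality with PCA (as in the paper).
    pca = None
    if X.shape[0] <= X.shape[1]:
        pca = PCA(n_components=X.shape[0] - 1)
        X = pca.fit_transform(X)

    # Within-class scatter: sum of per-class centered outer products.
    Sw = np.zeros((X.shape[1], X.shape[1]))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        Sw += Xc.T @ Xc
    Sw += eps * np.eye(Sw.shape[0])  # ridge term for numerical stability

    # With w = Sw^{-1/2} v, the MCVSVM objective w' Sw w becomes the
    # ordinary SVM margin term v'v on the whitened inputs Sw^{-1/2} x.
    evals, evecs = np.linalg.eigh(Sw)
    Sw_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T

    svm = SVC(kernel="linear", C=C).fit(X @ Sw_inv_sqrt, y)
    return pca, Sw_inv_sqrt, svm

def predict_mcvsvm(model, X):
    pca, Sw_inv_sqrt, svm = model
    if pca is not None:
        X = pca.transform(X)
    return svm.predict(X @ Sw_inv_sqrt)

Applying the same routine to a kernel PCA embedding of the data would give the nonlinear variant, consistent with the abstract's observation that, under kernel PCA, the kernelized problem reduces to an equivalent linear MCVSVM problem.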

Published in:

IEEE Transactions on Image Processing (Volume 16, Issue 10)