Two Criteria for Model Selection in Multiclass Support Vector Machines

Author(s): Lei Wang (Research School of Information Sciences and Engineering, Australian National University, Canberra, ACT); Ping Xue; Kap Luk Chan

Practical applications call for efficient model selection criteria for multiclass support vector machine (SVM) classification. To address this need, this paper develops two model selection criteria by combining or redefining the radius-margin bound used in binary SVMs. The combination is justified by linking the test error rate of a multiclass SVM with that of a set of binary SVMs. The redefinition, which is relatively heuristic, is inspired by the conceptual relationship between the radius-margin bound and the class separability measure. Hence, the two criteria are developed from the perspective of model selection rather than as a generalization of the radius-margin bound to multiclass SVMs. As demonstrated by an extensive experimental study, minimizing these two criteria achieves good model selection on most data sets. Compared with k-fold cross-validation, which is often regarded as a benchmark, the two criteria achieve comparable performance with much less computational overhead, particularly when a large number of model parameters must be optimized.
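To make the idea concrete, the sketch below computes a radius-margin value R²·||w||² for each binary subproblem and sums it over all one-vs-one class pairs. This is a minimal illustration of the kind of combined criterion the abstract describes, not the paper's exact formula: the sum-over-pairs combination, the function names, and the approximation of R² by the largest kernel-space distance to the data mean are all assumptions made here for illustration.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def radius_squared(K):
    # Approximate the squared radius R^2 of the enclosing sphere in feature
    # space by the largest squared distance to the kernel-space mean
    # (an assumption; the exact R^2 solves a small QP).
    return np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())

def radius_margin(X, y, gamma, C=10.0):
    # R^2 * ||w||^2 for one binary problem, with ||w||^2 recovered from
    # the dual coefficients: sum_ij (alpha_i y_i)(alpha_j y_j) K(x_i, x_j).
    clf = SVC(kernel="rbf", gamma=gamma, C=C).fit(X, y)
    K = rbf_kernel(X, X, gamma)
    sv = clf.support_                   # indices of support vectors
    coef = clf.dual_coef_.ravel()       # alpha_i * y_i for support vectors
    w_norm2 = coef @ K[np.ix_(sv, sv)] @ coef
    return radius_squared(K) * w_norm2

def combined_criterion(X, y, gamma, C=10.0):
    # One plausible combination: sum the binary radius-margin terms over
    # all one-vs-one class pairs, to be minimized over (gamma, C).
    total = 0.0
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        total += radius_margin(X[mask], y[mask], gamma, C)
    return total
```

A model-selection loop would then evaluate `combined_criterion` over a grid of kernel parameters and keep the minimizer, in place of running k-fold cross-validation at every grid point.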

Published in:

IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics (Volume: 38, Issue: 6)