
Testing Adaptive Local Hyperplane for multi-class classification by double cross-validation

4 Author(s)
Tao Yang (Fac. of Eng. & Inf. Technol., Univ. of Technol., Sydney, NSW, Australia); Vojislav Kecman; Longbing Cao; Chengqi Zhang

Adaptive Local Hyperplane (ALH) is a recently proposed classifier for multi-class classification, and it has shown encouraging performance on many pattern recognition problems. However, ALH's performance on many general classification datasets has so far been tested only with a single loop of cross-validation, in which the whole dataset is used both for hyper-parameter determination and for accuracy estimation. This procedure is appropriate for comparing classifiers, but the resulting accuracy estimates are likely to be optimistic when extrapolated to new datasets. In this paper, we test the performance of ALH and of several benchmark classifiers using a two-loop cross-validation (a.k.a. double resampling) procedure, in which the inner loop is used for hyper-parameter determination and the outer loop for accuracy estimation. Under this testing scheme, the classification accuracy of a classifier is evaluated in a more rigorous way. The experimental results indicate the superior performance of the ALH classifier with respect to traditional classifiers, including the Support Vector Machine (SVM), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), Classification Tree (Tree), and K-local Hyperplane distance Nearest Neighbor (HKNN). These results suggest that the ALH classifier may become a useful tool for pattern recognition tasks.
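The double cross-validation scheme described in the abstract can be sketched in a few lines of plain Python. The snippet below is an illustrative toy, not the paper's implementation: it uses a simple KNN classifier (standing in for ALH, whose details are not given here) with the neighborhood size `k` as the hyper-parameter, a synthetic two-class dataset, an inner cross-validation loop to select `k`, and an outer loop whose held-out folds yield the accuracy estimate. All function names and the data-generating setup are assumptions chosen for the example.

```python
import random

def knn_predict(train, x, k):
    # Classify x by majority vote among the k nearest training points
    # (squared Euclidean distance); train is a list of (features, label).
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[:k]
    labels = [lab for _, lab in nearest]
    return max(set(labels), key=labels.count)

def cv_accuracy(data, k, folds):
    # Single-loop cross-validation accuracy of KNN for a fixed k.
    correct = 0
    for i in range(folds):
        test = data[i::folds]
        train = [p for j, p in enumerate(data) if j % folds != i]
        correct += sum(knn_predict(train, x, k) == y for x, y in test)
    return correct / len(data)

def nested_cv(data, k_grid, outer=5, inner=3):
    # Double cross-validation: the inner loop (cv_accuracy on the outer
    # training folds) picks the hyper-parameter k; the outer held-out fold,
    # never seen during selection, estimates accuracy.
    correct = 0
    for i in range(outer):
        test = data[i::outer]
        train = [p for j, p in enumerate(data) if j % outer != i]
        best_k = max(k_grid, key=lambda k: cv_accuracy(train, k, inner))
        correct += sum(knn_predict(train, x, best_k) == y for x, y in test)
    return correct / len(data)

# Synthetic two-class data: 2-D Gaussian blobs centered at 0 and 3 (assumed setup).
random.seed(0)
data = [([random.gauss(c, 1.0), random.gauss(c, 1.0)], c)
        for c in (0, 3) for _ in range(30)]
random.shuffle(data)
acc = nested_cv(data, k_grid=[1, 3, 5])
print(f"nested-CV accuracy estimate: {acc:.2f}")
```

Note that each outer fold may end up with a different `best_k`; the outer-loop accuracy therefore evaluates the whole model-selection procedure rather than one fixed hyper-parameter setting, which is exactly why it gives a less optimistic estimate than single-loop cross-validation.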

Published in:

The 2010 International Joint Conference on Neural Networks (IJCNN)

Date of Conference:

18-23 July 2010