Multiclass From Binary: Expanding One-Versus-All, One-Versus-One and ECOC-Based Approaches

Authors: A. Rocha, Institute of Computing, University of Campinas, Campinas, Brazil; S. Klein Goldenstein

Recently, there has been a lot of success in the development of effective binary classifiers. Although many statistical classification techniques have natural multiclass extensions, some, such as support vector machines, do not. The existing techniques for mapping multiclass problems onto a set of simpler binary classification problems run into serious efficiency problems when there are hundreds or even thousands of classes, and these are the scenarios where this paper's contributions shine. We introduce the concepts of correlation and joint probability of base binary learners. We learn these properties during the training stage, group the binary learners based on their independence, and, with a Bayesian approach, combine the results to predict the class of a new instance. Finally, we also discuss two additional strategies: one to reduce the number of required base learners in the multiclass classification, and another to find new base learners that might best complement the existing set. We use these two procedures iteratively to complement the initial solution and improve the overall performance. This paper has two goals: finding the most discriminative binary classifiers to solve a multiclass problem and keeping the solution efficient, i.e., using a small number of base learners. We validate and compare the method with a diverse set of methods from the literature on several publicly available datasets that range from small (10 to 26 classes) to large multiclass problems (1000 classes), always using simple, reproducible scenarios.
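For readers unfamiliar with the baseline decompositions named in the title, the sketch below (not the authors' implementation) shows the three standard ways of building a multiclass predictor from binary base learners: one-vs-all, one-vs-one, and error-correcting output codes (ECOC). It uses scikit-learn's stock implementations with a linear SVM base learner on the 10-class digits dataset; the dataset, base learner, and ECOC code size are illustrative choices only.

```python
# Minimal sketch of the standard binary-to-multiclass decompositions
# (one-vs-all, one-vs-one, ECOC) that the paper builds on and extends.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.multiclass import (OneVsRestClassifier, OneVsOneClassifier,
                                OutputCodeClassifier)
from sklearn.svm import LinearSVC

# 10-class toy problem standing in for the paper's larger benchmarks.
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A binary SVM serves as the base learner in every decomposition.
base = LinearSVC()

strategies = {
    # one-vs-all: one binary learner per class (N learners)
    "one-vs-all": OneVsRestClassifier(base),
    # one-vs-one: one learner per pair of classes (N*(N-1)/2 learners)
    "one-vs-one": OneVsOneClassifier(base),
    # ECOC: each class receives a binary code word; code_size sets the
    # ratio of binary learners to classes (here ~2N learners).
    "ECOC": OutputCodeClassifier(base, code_size=2, random_state=0),
}

for name, clf in strategies.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {clf.score(X_te, y_te):.3f}")
```

The efficiency issue the abstract highlights is visible in the learner counts above: one-vs-one grows quadratically with the number of classes, which is what motivates the paper's strategies for selecting a small, discriminative set of base learners and combining their outputs with a Bayesian approach.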

Published in:

IEEE Transactions on Neural Networks and Learning Systems (Volume: 25, Issue: 2)