A New Diverse AdaBoost Classifier

2 Author(s)
Tae-Ki An (KRRI, Sungkyunkwan Univ., Uiwang, South Korea); Moon-Hyun Kim

AdaBoost is one of the most popular algorithms for constructing a strong classifier as a linear combination of member classifiers. The member classifiers are selected to minimize the error at each iteration of the training process, so AdaBoost provides a very simple and useful method for generating ensemble classifiers. The performance of the ensemble depends on the diversity among the member classifiers as well as on the performance of each individual member. However, existing AdaBoost algorithms focus only on error minimization. In this paper, we propose a novel method that injects diversity into the AdaBoost process to improve the performance of the resulting classifiers. The proposed Diverse AdaBoost algorithm outperforms the Gentle AdaBoost algorithm because of the injected diversity. Our research contributes a method for designing optimized ensemble classifiers with diversity.
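The abstract describes the standard AdaBoost recipe (select the weak learner with lowest weighted error each round, combine with weights alpha) and a variant that also rewards diversity during selection. The sketch below is a generic illustration only, not the authors' Diverse AdaBoost: the disagreement-based diversity term and the div_weight trade-off parameter are assumptions made for exposition.

```python
# Minimal sketch: discrete AdaBoost with an illustrative diversity-aware selection step.
# The diversity term (disagreement with the current ensemble) is an assumption,
# not the method proposed in the paper.
import numpy as np

def train_diverse_adaboost(X, y, candidate_stumps, n_rounds=10, div_weight=0.5):
    """X: (n, d) features; y: labels in {-1, +1}.
    candidate_stumps: list of callables h(X) -> predictions in {-1, +1}.
    div_weight: hypothetical trade-off between accuracy and diversity."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights
    ensemble = []                     # list of (alpha, h)

    for _ in range(n_rounds):
        # current ensemble prediction, used to measure diversity of candidates
        F = np.zeros(n)
        for alpha, h in ensemble:
            F += alpha * h(X)
        ens_pred = np.sign(F) if ensemble else np.zeros(n)

        best_score, best_h, best_err = None, None, None
        for h in candidate_stumps:
            pred = h(X)
            err = np.sum(w * (pred != y))            # weighted training error
            diversity = np.mean(pred != ens_pred)    # disagreement with ensemble
            # illustrative objective: accuracy plus a diversity bonus
            score = (1.0 - err) + div_weight * diversity
            if best_score is None or score > best_score:
                best_score, best_h, best_err = score, h, err

        best_err = np.clip(best_err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1.0 - best_err) / best_err)  # standard AdaBoost weight
        ensemble.append((alpha, best_h))
        # re-weight samples: misclassified points gain weight
        w *= np.exp(-alpha * y * best_h(X))
        w /= w.sum()

    def predict(Xq):
        return np.sign(sum(a * h(Xq) for a, h in ensemble))
    return predict
```

With div_weight set to 0 the selection reduces to ordinary error-minimizing AdaBoost; increasing it prefers weak learners that disagree more with the current ensemble, which is one simple way to trade accuracy for diversity.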

Published in:

2010 International Conference on Artificial Intelligence and Computational Intelligence (AICI), Volume 1

Date of Conference:

23-24 Oct. 2010