AdaBoost is one of the most popular algorithms for constructing a strong classifier as a linear combination of member classifiers. At each training iteration, a member classifier is selected to minimize the weighted error. AdaBoost thus provides a very simple and useful method for generating ensemble classifiers. The performance of an ensemble depends on the diversity among the member classifiers as well as on the performance of each member classifier. Existing AdaBoost algorithms, however, focus solely on error minimization. In this paper, we propose a novel method that injects diversity into the AdaBoost process to improve the performance of the resulting classifiers. The proposed Diverse AdaBoost algorithm outperforms Gentle AdaBoost because of the injected diversity. Our research contributes a method for designing optimized ensemble classifiers with diversity.
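For readers unfamiliar with the baseline procedure the abstract describes, the following is a minimal sketch of standard discrete AdaBoost with decision stumps as member classifiers. It illustrates only the conventional error-minimizing selection loop mentioned above, not the paper's Diverse AdaBoost or the Gentle AdaBoost variant; all function names and the toy stump learner are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with minimum weighted error."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])  # weighted misclassification error
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best, best_err

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def adaboost(X, y, rounds=10):
    """Each round selects the stump minimizing the current weighted error,
    then reweights: misclassified samples gain weight, correct ones lose it."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-12)                    # avoid log(0) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)    # member weight in the combination
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Strong classifier: sign of the linear combination of member outputs."""
    score = sum(alpha * stump_predict(s, X) for alpha, s in ensemble)
    return np.sign(score)
```

Note that nothing in this greedy loop rewards disagreement among the selected stumps; each round optimizes weighted error alone, which is precisely the limitation the abstract's diversity-injection proposal targets.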