Adaboost is an ensemble learning algorithm that combines many base classifiers to improve their performance. Since the work of Viola and Jones, Adaboost has often been used for local feature selection in object detection. The Viola-Jones Adaboost consists of the following two optimization schemes: (1) training the local features to make base classifiers, and (2) selecting the best local feature. Because the number of local features is usually in the tens of thousands or more, the learning algorithm is time consuming if both optimizations are performed exhaustively. To avoid this unnecessary redundancy in learning, we propose fast boosting algorithms based on Particle Swarm Optimization (PSO) and random candidate selection (RCS). The proposed learning algorithm is 50 times faster than standard Adaboost while keeping comparable classification accuracy.
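The abstract does not give the exact PSO formulation, but the random-candidate-selection idea can be sketched: instead of evaluating a decision stump on every one of the tens of thousands of features each boosting round, only a small random subset of candidate features is examined. The function names (`adaboost_rcs`, `train_stump`) and parameters below are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_stump(X, y, w, feat):
    """Fit a decision stump (threshold + polarity) on one feature
    by minimizing the weighted classification error."""
    best = (np.inf, 0.0, 1)  # (weighted error, threshold, polarity)
    for thr in np.unique(X[:, feat]):
        for pol in (1, -1):
            pred = np.where(pol * (X[:, feat] - thr) >= 0, 1, -1)
            err = np.sum(w[pred != y])
            if err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost_rcs(X, y, T=10, n_candidates=5):
    """AdaBoost where each round evaluates stumps on only a random
    subset of features (random candidate selection) instead of all
    of them -- the speed-up idea described in the abstract."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(T):
        # Sample a few candidate features at random for this round.
        cands = rng.choice(d, size=min(n_candidates, d), replace=False)
        results = [(train_stump(X, y, w, f), f) for f in cands]
        (err, thr, pol), f = min(results, key=lambda r: r[0][0])
        err = max(err, 1e-10)  # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
        # Reweight: misclassified samples gain weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, f, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all selected stumps."""
    score = np.zeros(len(X))
    for alpha, f, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)
```

With `n_candidates` much smaller than the feature count, each round's cost drops proportionally; boosting's reweighting still concentrates effort on hard examples, which is why accuracy can remain comparable to the exhaustive search.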