Fast training algorithm by Particle Swarm Optimization and random candidate selection for rectangular feature based boosted detector

Authors: A. Hidaka; T. Kurita (Dept. of Systems & Information Engineering, University of Tsukuba, Tsukuba)

AdaBoost is an ensemble learning algorithm that combines many base classifiers to improve their performance. Starting with Viola and Jones's research, AdaBoost has often been used for local feature selection in object detection. The Viola-Jones AdaBoost consists of the following two optimization schemes: (1) training of the local features to make base classifiers, and (2) selection of the best local feature. Because the number of local features usually exceeds tens of thousands, the learning algorithm is time-consuming if both optimizations are performed exhaustively. To omit this unnecessary redundancy, we propose fast boosting algorithms using Particle Swarm Optimization (PSO) and random candidate selection (RCS). The proposed learning algorithm is 50 times faster than standard AdaBoost while keeping comparable classification accuracy.
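To illustrate the two speed-up ideas generically, here is a minimal sketch, not the authors' implementation: a standard global-best PSO that could search a continuous feature-parameter space in place of exhaustive enumeration, and a random-candidate-selection helper that evaluates only a random subset of a large feature pool per boosting round. The function names, hyperparameters, and toy fitness are illustrative assumptions.

```python
import random

def pso_minimize(fitness, dim, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `fitness` over a box via standard global-best PSO.

    In the boosting setting, `fitness` would be the weighted error of a
    base classifier parameterized by the particle position (an assumption
    for illustration; the paper's exact encoding may differ).
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp positions to the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

def random_candidate_selection(features, weighted_error, n_sample, rng=None):
    """Evaluate only a random subset of the feature pool and return the
    subset's best feature -- far cheaper than scanning all candidates."""
    rng = rng or random.Random()
    subset = rng.sample(features, min(n_sample, len(features)))
    return min(subset, key=weighted_error)
```

Either routine replaces the exhaustive scan over tens of thousands of rectangular features with a search that touches only a small fraction of the candidates each round, which is where the reported speed-up would come from.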

Published in:

2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008, part of the IEEE World Congress on Computational Intelligence)

Date of Conference:

1-8 June 2008