An active boosting-based learning framework for real-time hand detection

Authors:

Thuy Thi Nguyen; Nguyen Dang Binh; Bischof, H. — Institute for Computer Graphics and Vision, Graz University of Technology, Graz

Hand detection has important applications in sign language recognition and human-machine interfaces. In this work, we present a novel approach to learning a vision-based hand detection system. The main contribution is a robust online boosting-based framework for real-time detection of a hand in unconstrained environments. The use of efficient representative features allows fast computation while coping with large variations in hand appearance and background. Interactive online training allows the detector to be trained and improved efficiently. Moreover, we propose a strategy that improves performance while reducing hand-labeling effort. In addition, where necessary, a verification process prevents "drifting" of the classifier over time. The proposed method is practically attractive, as it meets the requirements of real-time performance, accuracy, and robustness. It works well with a reasonable number of training samples and is computationally efficient. Experiments on challenging hand-detection data sets show that our approach outperforms existing methods.
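To make the online-boosting idea concrete, the following is a minimal illustrative sketch of an Oza-Russell-style online boosting update, in which each weak learner is updated with a Poisson-sampled number of repetitions and the sample's importance weight is adjusted by each learner's running error. All class and function names here (`OnlineBooster`, `Stump`, `poisson`) are assumptions for illustration, not the authors' actual implementation or features.

```python
import math
import random

def poisson(lam, rng):
    # Knuth's algorithm: draw k ~ Poisson(lam)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

class Stump:
    """Toy weak learner: threshold halfway between running class means
    of one feature dimension. Purely illustrative."""
    def __init__(self, dim):
        self.dim = dim
        self.sum = {1: 0.0, -1: 0.0}
        self.n = {1: 0, -1: 0}

    def update(self, x, y):
        self.sum[y] += x[self.dim]
        self.n[y] += 1

    def predict(self, x):
        m_pos = self.sum[1] / self.n[1] if self.n[1] else 0.0
        m_neg = self.sum[-1] / self.n[-1] if self.n[-1] else 0.0
        thresh = (m_pos + m_neg) / 2.0
        return 1 if (x[self.dim] > thresh) == (m_pos > m_neg) else -1

class OnlineBooster:
    """Sketch of online boosting: one labeled sample at a time, no stored
    training set. Weight bookkeeping follows the Oza-Russell scheme."""
    def __init__(self, weak_learners):
        self.learners = weak_learners
        self.correct = [1e-8] * len(weak_learners)  # weighted correct mass
        self.wrong = [1e-8] * len(weak_learners)    # weighted error mass

    def update(self, x, y, rng=random):
        lam = 1.0  # importance weight of this sample
        for i, h in enumerate(self.learners):
            # Poisson sampling approximates weighted resampling online
            for _ in range(poisson(lam, rng)):
                h.update(x, y)
            if h.predict(x) == y:
                self.correct[i] += lam
                lam *= (self.correct[i] + self.wrong[i]) / (2 * self.correct[i])
            else:
                self.wrong[i] += lam
                lam *= (self.correct[i] + self.wrong[i]) / (2 * self.wrong[i])

    def predict(self, x):
        # Weighted vote; alpha grows with each learner's accuracy
        score = 0.0
        for i, h in enumerate(self.learners):
            alpha = 0.5 * math.log(self.correct[i] / self.wrong[i])
            score += alpha * h.predict(x)
        return 1 if score > 0 else -1
```

In a detector like the one described above, the weak learners would respond to image features rather than a raw scalar, and the verification step mentioned in the abstract would gate which samples are fed to `update`, preventing mislabeled detections from drifting the classifier.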

Published in:

2008 8th IEEE International Conference on Automatic Face & Gesture Recognition (FG '08)

Date of Conference:

17-19 Sept. 2008