Scene Analysis for Object Detection in Advanced Surveillance Systems Using Laplacian Distribution Model

Authors: Fan-Chieh Cheng, Shih-Chia Huang, and Shanq-Jang Ruan (Department of Electronic Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan)

In this paper, we propose a novel background subtraction approach for the accurate detection of moving objects. Our method comprises three modules: a block alarm module, a background modeling module, and an object extraction module. The block alarm module efficiently checks each block for the presence of either a moving object or background information; this is accomplished by using temporal differencing pixels modeled with a Laplacian distribution, and it allows the subsequent background modeling module to process only those blocks found to contain background pixels. Next, the background modeling module generates a high-quality adaptive background model using a unique two-stage training procedure and a novel mechanism for recognizing changes in illumination. As the final step, the object extraction module computes the binary object detection mask by applying a suitable threshold value, which is obtained through our proposed threshold training procedure. The performance of our method was analyzed by both quantitative and qualitative evaluation. The overall results show that our method attains a substantially higher degree of efficacy, outperforming other state-of-the-art methods by up to 35.50% and 26.09% in Similarity and F1 accuracy rates, respectively.
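The block alarm idea described in the abstract can be illustrated with a short sketch. This is not the paper's implementation: the function name, the significance factor `alpha`, the block size, and the `ratio` vote threshold are all assumptions introduced here for illustration. It models temporal differences of static background pixels as Laplacian noise (location estimated by the median, scale by the mean absolute deviation, which are the Laplacian maximum-likelihood estimates), flags pixels whose differences are improbably large under that model, and raises an alarm for any block in which enough pixels are flagged.

```python
import numpy as np

def block_alarm(prev_frame, curr_frame, block=8, alpha=4.0, ratio=0.5):
    """Illustrative sketch of a Laplacian-model block alarm (not the
    authors' exact procedure; alpha, block, and ratio are assumed)."""
    # Temporal differencing between consecutive frames.
    d = curr_frame.astype(np.float64) - prev_frame.astype(np.float64)
    # Laplacian MLE: location = median, scale = mean absolute deviation.
    mu = np.median(d)
    b = np.mean(np.abs(d - mu)) + 1e-8
    # A pixel is significant if its difference is unlikely background noise.
    sig = np.abs(d - mu) > alpha * b
    # Tile the significance map into non-overlapping block x block cells.
    h, w = sig.shape
    cells = sig[:h - h % block, :w - w % block]
    cells = cells.reshape(h // block, block, w // block, block)
    frac = cells.mean(axis=(1, 3))  # fraction of significant pixels per block
    # True = block likely contains a moving object; False = background,
    # so only False blocks would be passed on to background modeling.
    return frac >= ratio
```

Blocks that stay below the vote threshold are treated as background and would be handed to the background modeling stage, which is what lets that stage skip blocks occupied by moving objects.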

Published in: IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), Volume 41, Issue 5