Object segmentation in stereo image using cooperative line field in stochastic diffusion

3 Author(s)
Sang Hwa Lee (Object-based Broadcasting Syst. Res. Lab., NHK Eng. Services, Tokyo, Japan); Y. Kanatsugu; Jong-Il Park

In conventional object segmentation, the main procedure is motion estimation followed by motion clustering to segment the image into a few regions. This paper proposes a new approach that performs segmentation using MAP-based motion estimation and a cooperative line field model. In this approach, no motion clustering or region classification is performed; instead, the line field itself generates the object segmentation. The cooperative line field is based on the gradients of color and on the correspondence field estimated by the MAP-based estimator. Stochastic diffusion is used as the optimization method to search for the minimal potential in the MAP-based estimation. The estimated correspondence field represents the depth information in the stereo image, and the final line field forms the boundaries that distinguish objects from the background and from one another. Experiments show that estimation of the correspondence field is improved by the line field, and that the cooperative line field performs well in both whole-object segmentation and detailed contour extraction.
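To make the cooperative idea concrete, the following is a minimal sketch (not the authors' exact MAP model) of a line field driven jointly by color gradients and estimated disparity gradients: a line element is switched on only where both cues indicate a discontinuity. The function name, threshold parameters, and their values are assumptions for illustration.

```python
import numpy as np

def cooperative_line_field(color, disparity, t_color=0.3, t_disp=0.5):
    """Illustrative sketch of a cooperative line field (an assumption,
    not the paper's exact formulation): activate a line element between
    neighboring pixels only when the color gradient and the estimated
    disparity (correspondence) gradient both exceed their thresholds."""
    # absolute differences between vertically / horizontally adjacent pixels
    gc_v = np.abs(np.diff(color, axis=0))      # color gradient, vertical pairs
    gc_h = np.abs(np.diff(color, axis=1))      # color gradient, horizontal pairs
    gd_v = np.abs(np.diff(disparity, axis=0))  # disparity gradient, vertical pairs
    gd_h = np.abs(np.diff(disparity, axis=1))  # disparity gradient, horizontal pairs
    # "cooperation": both cues must agree before a boundary element turns on
    line_v = (gc_v > t_color) & (gd_v > t_disp)
    line_h = (gc_h > t_color) & (gd_h > t_disp)
    return line_v, line_h
```

In the paper's full scheme, these line variables would instead be coupled into the MAP potential and relaxed by stochastic diffusion, so that the line field and the correspondence field refine each other iteratively.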

Published in:

Proceedings of the 2001 International Conference on Image Processing (Volume 3)

Date of Conference: