Object-based estimation of dense motion fields

Author: C. Stiller (Corporate Research & Development, Robert Bosch GmbH, Hildesheim, Germany)

Abstract: Motion estimation is one of the key techniques in image sequence processing. Segmentation of motion fields such that, ideally, each independently moving object corresponds uniquely to one region is an essential element of object-based image processing. This paper is concerned with the unsupervised simultaneous estimation of dense motion fields and their segmentations. It is based on a stochastic model relating image intensities to motion information. Based on an analysis of natural images, a region-based model of the motion-compensated prediction error is proposed: within each region, the error is modeled by a white, stationary, generalized Gaussian random process. The motion field and its segmentation are themselves modeled by a compound Gibbs/Markov random field that accounts for statistical bindings both spatially and along the direction of motion trajectories. The a posteriori distribution of the motion field given the image sequence is formulated as an objective function whose maximization yields the MAP estimate. A deterministic multiscale relaxation technique with a regular structure is employed to optimize the objective function. Simulation results agree well with human perception for both the motion fields and their segmentations.
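To illustrate the kind of objective function the abstract describes, the sketch below combines a generalized-Gaussian data term on the prediction error with a simple first-order Markov smoothness prior on the motion field. This is only a minimal illustration of the MAP formulation, not the paper's actual model: the function names, the choice of an L1-style neighbour penalty, and the parameter defaults (`p`, `alpha`, `lam`) are all assumptions for the sketch.

```python
import numpy as np

def gg_data_term(error, p=1.0, alpha=1.0):
    """Negative log-likelihood of the motion-compensated prediction error
    under a zero-mean generalized Gaussian model, up to additive constants:
    sum over pixels of |e / alpha|**p."""
    return np.sum(np.abs(error / alpha) ** p)

def smoothness_term(u, v):
    """Illustrative first-order Gibbs/Markov spatial prior on a motion
    field (u, v): penalizes absolute differences between horizontally and
    vertically adjacent motion-vector components."""
    du = np.abs(np.diff(u, axis=0)).sum() + np.abs(np.diff(u, axis=1)).sum()
    dv = np.abs(np.diff(v, axis=0)).sum() + np.abs(np.diff(v, axis=1)).sum()
    return du + dv

def map_objective(error, u, v, p=1.0, alpha=1.0, lam=0.1):
    """Energy whose minimization corresponds to MAP estimation:
    data term (likelihood) plus a weighted prior (smoothness).
    `lam` trades off fidelity against motion-field regularity."""
    return gg_data_term(error, p, alpha) + lam * smoothness_term(u, v)
```

In this toy form, a relaxation scheme would iteratively perturb `u` and `v` (and, in the paper's full model, the segmentation labels) to decrease `map_objective`; the paper additionally extends the prior along motion trajectories and optimizes over multiple scales.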

Published in:

IEEE Transactions on Image Processing (Volume: 6, Issue: 2)