Virtual view synthesis with heuristic spatial motion

Authors:

Wenfeng Li; Baoxin Li (Dept. of Comput. Sci. & Eng., Arizona State Univ., Tempe, AZ)

Abstract:

Probabilistic methods have been used in image-based rendering to solve the virtual view synthesis problem with Bayesian inference. To yield reasonable results, the inference process requires the input views to be consistent, which in turn constrains the cameras to be very close to each other. Many approaches to relaxing this constraint focus on the prior model. In this paper, we present a method that treats the virtual view as the outcome of a spatial motion from one real view. A sequence of images is generated heuristically, with steerable filters used to preserve textures. Interim results are further refined with a texture-based Markov random field prior model. Experiments show that the synthesized view can achieve satisfactory image quality with only a few input images from wide-baseline cameras.
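The texture-preserving ingredient mentioned in the abstract, steerable filters, can be illustrated with a minimal sketch. This is not the authors' implementation: it only demonstrates the steering property of a first-derivative-of-Gaussian filter, where the response at any orientation theta is cos(theta) times the x-derivative basis plus sin(theta) times the y-derivative basis, so oriented texture can be analyzed at arbitrary angles from just two convolutions. Function and parameter names below (steered_response, sigma) are illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter

def steered_response(image, theta, sigma=2.0):
    """First-derivative-of-Gaussian response steered to angle theta (radians).

    Only two basis convolutions are needed; any orientation is a linear
    combination of them (the steering property of this filter family).
    """
    gx = gaussian_filter(image, sigma=sigma, order=(0, 1))  # d/dx basis
    gy = gaussian_filter(image, sigma=sigma, order=(1, 0))  # d/dy basis
    return np.cos(theta) * gx + np.sin(theta) * gy

if __name__ == "__main__":
    # Synthetic oriented texture: diagonal stripes, constant along x + y = const.
    y, x = np.mgrid[0:128, 0:128]
    stripes = np.sin(0.3 * (x + y))

    # The response peaks when the filter is steered across the stripes (45 deg)
    # and nearly vanishes along them (135 deg); an interim-view generator could
    # use such orientation estimates to keep texture consistent between frames.
    for deg in (0, 45, 90, 135):
        r = steered_response(stripes, np.deg2rad(deg))
        print(f"theta = {deg:3d} deg, mean |response| = {np.abs(r).mean():.4f}")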

Published in:

2008 15th IEEE International Conference on Image Processing (ICIP 2008)

Date of Conference:

12-15 Oct. 2008