A unified approach for motion analysis and view synthesis

2 Author(s)
A. Rav-Acha; S. Peleg — Sch. of Comput. Sci. & Eng., The Hebrew Univ. of Jerusalem, Israel

Image-based rendering (IBR) consists of several steps: (i) calibration (or ego-motion computation) of all input images; (ii) determination of the regions in the input images used to synthesize the new view; and (iii) interpolation of the new view from the selected regions of the input images. We propose a unified representation for all these aspects of IBR using the space-time (x-y-t) volume. The presented approach is robust and allows IBR to be used in general conditions, even with a hand-held camera. To address step (i), the space-time volume is constructed by placing frames at locations along the time axis so that image features create straight lines in the epipolar plane images (EPIs). Different slices of the space-time volume are used to produce new views, addressing step (ii). Step (iii) is done by interpolating between image samples using the feature lines in the EPIs. IBR examples are shown for various cases: sequences taken from a driving car, from a hand-held camera, and from a tripod.
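The space-time-volume idea from the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: frames are placed uniformly along the time axis (the paper instead spaces them so that feature trajectories become straight lines in the EPI), and the "new view" here is simply a per-column re-slicing of the volume. All function names are hypothetical.

```python
import numpy as np

def build_space_time_volume(frames):
    """Stack grayscale frames (each H x W) into an x-y-t volume of shape (T, H, W)."""
    return np.stack(frames, axis=0)

def epi_slice(volume, y):
    """EPI at image row y: a (T x W) slice of the volume.
    A feature moving horizontally traces a line in this image."""
    return volume[:, y, :]

def synthesize_column_view(volume, t_of_x):
    """Toy view synthesis: for each column x, take that column from the
    frame at time t_of_x[x], i.e. a non-planar slice of the volume."""
    T, H, W = volume.shape
    return np.stack([volume[t_of_x[x], :, x] for x in range(W)], axis=1)

# Example: 4 frames of a 3x5 scene with a bright dot sliding one pixel
# to the right per frame, as seen under horizontal camera/scene motion.
frames = []
for t in range(4):
    f = np.zeros((3, 5))
    f[1, t] = 1.0  # the moving feature
    frames.append(f)

vol = build_space_time_volume(frames)   # shape (4, 3, 5)
epi = epi_slice(vol, y=1)               # the dot traces a diagonal line
view = synthesize_column_view(vol, t_of_x=[0, 1, 2, 3, 0])
```

In the EPI slice, the moving dot appears along the diagonal `epi[t, t]`, which is exactly the straight feature line the method relies on for both calibration and interpolation.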

Published in:

Proceedings of the 2nd International Symposium on 3D Data Processing, Visualization and Transmission (3DPVT 2004)

Date of Conference:

6-9 Sept. 2004