Single View Camera Calibration for Augmented Virtual Environments

3 Author(s)
Lu Wang; S. You; U. Neumann (Computer Graphics & Immersive Technologies Lab., University of Southern California)

Augmented virtual environments (AVEs) are highly effective for surveillance applications, in which multiple video streams are projected onto a 3D urban model for better visualization and comprehension of dynamic scenes. A key issue in creating such systems is estimating the parameters of each camera, including its intrinsic parameters and its pose relative to the 3D model. Existing camera pose estimation approaches require known intrinsic parameters and at least three 2D-to-3D feature (point or line) correspondences, which cannot always be satisfied in an AVE system. Moreover, due to noise, the estimated camera location may be far from what the user expects when the number of correspondences is small. Our approach combines the user's prior knowledge about the camera location, and the constraints from parallel relationships between lines, with those from feature correspondences. With as few as two feature correspondences, it can always output an estimate of the camera parameters that gives an accurate alignment between the projection of the image (or video) and the 3D model.
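The core idea in the abstract, combining 2D-to-3D reprojection constraints with a soft prior on the camera location, can be sketched as a joint cost function. This is a hypothetical illustration, not the authors' implementation; the function names (`project`, `pose_cost`), the simple pinhole model, and the quadratic prior weight are all assumptions for the sake of the example.

```python
import numpy as np

def project(K, R, C, X):
    """Project 3D point X with a pinhole camera: intrinsics K,
    rotation R (world->camera), camera center C in world coordinates."""
    x_cam = R @ (X - C)          # transform into the camera frame
    x_img = K @ x_cam            # apply intrinsic parameters
    return x_img[:2] / x_img[2]  # perspective divide -> pixel coordinates

def pose_cost(K, R, C, points_3d, points_2d, C_prior, weight=1.0):
    """Sum of squared reprojection errors plus a weighted penalty on the
    distance from the user's expected camera location C_prior.

    With only two correspondences the pose is under-constrained, so the
    prior term keeps the minimizer near the expected location, echoing
    the abstract's use of prior knowledge about the camera position."""
    reproj = sum(np.sum((project(K, R, C, X) - x) ** 2)
                 for X, x in zip(points_3d, points_2d))
    return reproj + weight * np.sum((C - C_prior) ** 2)
```

In a full system this cost would be minimized over the pose (and unknown intrinsics), with additional terms for line-parallelism constraints; here it only illustrates how correspondence and prior terms combine into one objective.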

Published in:

Virtual Reality Conference, 2007. VR '07. IEEE

Date of Conference:

10-14 March 2007