
Complete calibration of a multi-camera network

2 Author(s)
Baker, P.; Aloimonos, Y. (Comput. Vision Lab., Maryland Univ., College Park, MD, USA)

We describe a calibration procedure for a multi-camera rig. Consider a large number of synchronized cameras arranged in some space, for example, on the walls of a room looking inwards. It is not necessary for all the cameras to have a common field of view, as long as every camera is connected to every other camera through common fields of view. By switching off the lights and waving a wand with an LED at its end, we can capture a very large set of point correspondences (corresponding points are captured at the same time stamp). The correspondences are then used in a large nonlinear eigenvalue minimization routine whose basis is the epipolar constraint. The eigenvalue matrix encapsulates all point correspondences between every pair of cameras in such a way that minimizing its smallest eigenvalue yields the projection matrices, to within a single perspective transformation. In a second step, given additional data from waving a rod with two LEDs (one at each end), the full projection matrices are calculated. The method is extremely accurate: the reprojections of the reconstructed points were within a pixel.
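To illustrate the core idea of minimizing the smallest eigenvalue of a matrix built from the epipolar constraint, here is a minimal sketch for the two-camera case (the paper's routine couples every camera pair into one large minimization; this pairwise version is an assumption for illustration, not the authors' full algorithm). Each correspondence x1 ↔ x2 contributes one row of a matrix A encoding x2ᵀ F x1 = 0, and the fundamental matrix F is recovered as the eigenvector of AᵀA with the smallest eigenvalue (equivalently, the last right singular vector of A):

```python
import numpy as np

def estimate_fundamental(x1, x2):
    """Estimate the fundamental matrix F from N >= 8 point
    correspondences x1[i] <-> x2[i] (each a 2-vector of image
    coordinates) by minimizing the smallest eigenvalue of A^T A,
    where each row of A encodes one epipolar constraint
    x2^T F x1 = 0."""
    n = x1.shape[0]
    A = np.zeros((n, 9))
    for i in range(n):
        u1, v1 = x1[i]
        u2, v2 = x2[i]
        # Row of the epipolar constraint, linear in the 9 entries of F.
        A[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1]
    # The minimizer of ||A f|| subject to ||f|| = 1 is the eigenvector
    # of A^T A with the smallest eigenvalue, i.e. the last right
    # singular vector of A.
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2, a property of any fundamental matrix.
    U, S, Vt2 = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt2
```

With noise-free correspondences from two real cameras, the recovered F satisfies the epipolar constraint to numerical precision; with real LED detections, the smallest eigenvalue is nonzero and its magnitude measures the residual calibration error.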

Published in:

Proceedings of the IEEE Workshop on Omnidirectional Vision, 2000

Date of Conference: