
Real-time 3D motion and structure of point features: a front-end system for vision-based control and interaction

Authors:
Jin, H.; Washington Univ., St. Louis, MO, USA; Favaro, P.; Soatto, S.

Abstract:

We present a system consisting of one camera connected to a personal computer that can (a) select and track a number of high-contrast point features on a sequence of images, (b) estimate their three-dimensional motion and position relative to an inertial reference frame, assuming rigidity, and (c) handle occlusions that cause point features to disappear as well as new features to appear. The system can also (d) perform partial self-calibration and (e) check for consistency of the rigidity assumption, although these features are not implemented in the current release. All of this is done automatically and in real time (30 Hz) for 40-50 point features using commercial off-the-shelf hardware. The system is based on an algorithm presented by Chiuso et al. (2000), the properties of which have been analyzed by Chiuso and Soatto (2000). In particular, the algorithm is provably observable, provably minimal, and provably stable under suitable conditions. The core of the system, consisting of C++ code ready to interface with a frame grabber as well as Matlab code for development, is available at http://ee.wustl.edu/~soatto/research.html. We demonstrate the system by showing its use as (1) an ego-motion estimator, (2) an object tracker, and (3) an interactive input device, all without any modification of the system settings.
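The released front-end at the URL above is the authors' own C++/Matlab code. The sketch below is only a minimal, illustrative analogue of the per-frame loop behind items (a) and (c), written against OpenCV's Shi-Tomasi corner detector and pyramidal Lucas-Kanade tracker; OpenCV itself, the camera index, and the 40/50 feature-count thresholds are assumptions made here for illustration, and the motion and structure filter of item (b), i.e. the algorithm of Chiuso et al. (2000), is indicated only by a comment.

#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);          // assumed frame source (camera index 0)
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, prevGray;
    std::vector<cv::Point2f> prevPts, pts;

    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        if (!prevGray.empty() && !prevPts.empty()) {
            std::vector<uchar> status;
            std::vector<float> err;
            // (a) track existing point features into the new frame
            cv::calcOpticalFlowPyrLK(prevGray, gray, prevPts, pts, status, err);

            // (c) drop features lost to occlusion or leaving the field of view
            std::vector<cv::Point2f> kept;
            for (size_t i = 0; i < pts.size(); ++i)
                if (status[i]) kept.push_back(pts[i]);
            prevPts = kept;

            // (b) the surviving tracks would be fed to the causal
            //     motion/structure filter of Chiuso et al. (2000) here
        }

        // (a)/(c) replenish with new high-contrast features when the pool runs
        // low (for brevity this sketch replaces the pool rather than merging)
        if (prevPts.size() < 40)
            cv::goodFeaturesToTrack(gray, prevPts, 50, 0.01, 10.0);

        prevGray = gray.clone();
    }
    return 0;
}

Dropping features whose tracking status fails and replenishing the pool when it runs low is the bookkeeping that lets a filter of this kind tolerate occlusions and newly appearing features, as described in item (c) of the abstract.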

Published in:

Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2000), Volume 2

Date of Conference:

15 June 2000