
Kinectrack: 3D Pose Estimation Using a Projected Dense Dot Pattern

Authors: McIlroy, P. (Dept. of Engineering, University of Cambridge, Cambridge, UK); Izadi, S.; Fitzgibbon, A.

Kinectrack is a novel approach to six-DoF tracking that provides agile real-time pose estimation using only commodity hardware. The dot-pattern emitter and IR camera components of the standard Kinect device are separated so that the emitter can roam freely relative to a fixed camera. The six-DoF pose of the emitter is recovered by matching the dense dot pattern observed by the camera to a pre-captured reference image. A novel matching technique obtains the dense dot-pattern correspondences efficiently in wide- and adaptive-baseline scenarios, requiring only a small subset of the full dot pattern to fall within the field of view of the fixed camera. An auto-calibration process is proposed to obtain the intrinsic parameters of the fixed camera and the internal dot-pattern reference image of the emitter. The system simultaneously recovers the six-DoF pose of the emitter device and the piecewise-planar 3D scene structure. Kinectrack provides a low-cost method for tracking an object without any on-board computation, requiring only a small device with simple electronics. This paper extends the original ISMAR 2012 submission with a demonstration of robust pose tracking for AR and examples of matching in planar and non-planar scenes.
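The core idea of the abstract — recovering a pose from corresponded dot positions — can be illustrated with a much-simplified 2D analogue. The sketch below is not the paper's algorithm (which recovers a full six-DoF pose from a projected IR dot pattern); it is a toy least-squares rigid fit in the plane, with the function name and test points chosen for illustration only.

```python
import math

def align_2d(ref, obs):
    """Recover the rotation angle and translation mapping ref -> obs.

    ref, obs: equal-length lists of corresponded (x, y) dot positions.
    Returns (theta, tx, ty) of the least-squares rigid transform
    obs_i = R(theta) @ ref_i + t  -- a 2D stand-in for 6-DoF pose recovery.
    """
    n = len(ref)
    # Centroids of both dot sets.
    cx_r = sum(p[0] for p in ref) / n
    cy_r = sum(p[1] for p in ref) / n
    cx_o = sum(p[0] for p in obs) / n
    cy_o = sum(p[1] for p in obs) / n
    # Accumulate dot- and cross-product terms of the centred covariance;
    # in 2D the optimal rotation angle follows directly from these.
    s_cos = s_sin = 0.0
    for (xr, yr), (xo, yo) in zip(ref, obs):
        xr, yr = xr - cx_r, yr - cy_r
        xo, yo = xo - cx_o, yo - cy_o
        s_cos += xr * xo + yr * yo
        s_sin += xr * yo - yr * xo
    theta = math.atan2(s_sin, s_cos)
    # Translation carries the rotated reference centroid onto the observed one.
    tx = cx_o - (cx_r * math.cos(theta) - cy_r * math.sin(theta))
    ty = cy_o - (cx_r * math.sin(theta) + cy_r * math.cos(theta))
    return theta, tx, ty
```

In the full system the correspondences themselves are the hard part — the novel matching technique must find them from only a small visible subset of the dense dot pattern — whereas this sketch assumes they are already known.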

Published in: IEEE Transactions on Visualization and Computer Graphics (Volume: 20, Issue: 6)