Head-tracking virtual 3-D display for mobile devices

4 Author(s)
Lopez, M.B. (Univ. of Oulu, Oulu, Finland); Hannuksela, J.; Silven, O.; Lixin Fan

Computer vision enables camera data to be used in user interfaces to analyze the 3-D context and automatically detect user intentions. Using cameras as an input modality enables single-handed operation in which the user's actions are recognized without interaction with the screen or keypad. In this context, we have built a real-time mobile application prototype in which the user's position and gaze are determined in real time, a technique that enables the display of true three-dimensional objects even on a typical 2-D LCD screen. We have defined a series of interaction methods in which the user's motion and camera input realistically control the viewpoint on a 3-D scene. Head movement and gaze can be used to interact with hidden objects in a natural manner, simply by looking at them. We provide a system-level description of the embedded implementation, highlighting the application-development challenges and trade-offs that must be addressed on battery-powered mobile devices. The implementation includes a parallel pipeline that reduces the application's latencies.
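The paper does not include source code, but the head-coupled viewpoint control it describes is commonly realized with an off-axis (asymmetric) projection frustum: the screen is treated as a window, and the frustum is defined by the screen edges as seen from the tracked head position. The sketch below illustrates that idea; the function name, screen dimensions, and near-plane distance are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of head-coupled motion parallax via an off-axis frustum.
# The tracked head position (relative to the screen centre, in metres) defines
# an asymmetric view frustum whose near-plane window corresponds to the screen
# edges, in the style accepted by e.g. glFrustum.

def head_to_frustum(head_x, head_y, head_z,
                    screen_w=0.09, screen_h=0.055, near=0.01):
    """Map a tracked head position to off-axis frustum bounds.

    head_x, head_y: head offset from the screen centre (metres)
    head_z:         head distance from the screen plane (metres, > 0)
    screen_w/h:     physical screen size (metres, assumed values)
    near:           near clipping plane distance (assumed value)
    Returns (left, right, bottom, top) at the near plane.
    """
    # Each screen edge, projected from the head position onto the near
    # plane, gives one bound of the asymmetric frustum.
    left = (-screen_w / 2 - head_x) * near / head_z
    right = (screen_w / 2 - head_x) * near / head_z
    bottom = (-screen_h / 2 - head_y) * near / head_z
    top = (screen_h / 2 - head_y) * near / head_z
    return left, right, bottom, top
```

With the head centred, the frustum is symmetric; as the head moves right, the frustum shifts left, revealing more of the scene's left side, which produces the "window" illusion on a flat 2-D screen.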

Published in:

2012 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)

Date of Conference:

16-21 June 2012