Simultaneous categorical and spatio-temporal 3D gestures using Kinect

4 Author(s)
Bigdelou, A. ; Comput. Aided Med. Procedures (CAMP), Tech. Univ. München, Munich, Germany ; Benz, T. ; Schwarz, L. ; Navab, N.

Recent technological advances have led to an increasing popularity of 3D gesture-based interfaces, in particular in gaming and entertainment consoles. However, unlike 2D gestures, which have been successfully utilized in many multi-touch devices, developing a 3D gesture-based interface is not an easy endeavor. Reasons include the complexity of capturing human movements in 3D and the difficulties associated with recognizing gestures from human motion data. In this work, we target the latter problem by proposing a novel gesture recognition technique for skeletal input data that simultaneously allows for categorical and spatio-temporal gestures. In other words, it recognizes the gesture type and the relative pose within a gesture at the same time. Moreover, our method can learn the gestures most appropriate for the user from examples. To avoid the need for user-specific training, we further propose and evaluate several types of feature representations for human pose data. We discuss how our approach can facilitate the development of a customizable 3D gesture-based interface and explore possibilities for smoothly integrating the proposed recognition approach into available component-based user interface frameworks. Besides a quantitative evaluation, we present a user study in the scenario of a 3D gesture-based interface for an intra-operative medical image viewer. Our studies support the applicability of our method for developing 3D gesture-based interfaces in practice.
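To illustrate the core idea of recognizing "the gesture type and the relative pose within a gesture at the same time", here is a minimal, hypothetical sketch: a nearest-neighbour lookup over stored example sequences that returns both a categorical label and a relative phase in [0, 1]. All function names, the distance metric, and the toy data are assumptions for illustration; they are not the paper's actual method, which learns gestures from user examples and evaluates several pose feature representations.

```python
# Illustrative sketch only (not the authors' algorithm): classify a skeletal
# pose feature vector by its nearest stored exemplar, jointly recovering the
# gesture label (categorical) and the relative phase within that gesture
# (spatio-temporal).

import math


def pose_distance(a, b):
    """Euclidean distance between two flattened pose feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def build_exemplars(gestures):
    """gestures: {label: [pose_0, pose_1, ...]} recorded example sequences.

    Returns a flat list of (label, phase, pose) exemplars, where phase is
    the relative temporal position of the pose within its sequence."""
    exemplars = []
    for label, seq in gestures.items():
        n = len(seq)
        for i, pose in enumerate(seq):
            phase = i / (n - 1) if n > 1 else 0.0
            exemplars.append((label, phase, pose))
    return exemplars


def recognize(pose, exemplars):
    """Return the (label, phase) of the exemplar closest to the query pose."""
    label, phase, _ = min(exemplars, key=lambda e: pose_distance(pose, e[2]))
    return label, phase


# Toy 2-D "poses": a swipe moves along x, a raise moves along y.
gestures = {
    "swipe": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],
    "raise": [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],
}
exemplars = build_exemplars(gestures)
print(recognize((0.9, 0.1), exemplars))  # → ('swipe', 1.0), i.e. late in the swipe
```

In a real system the pose features would come from Kinect skeletal tracking rather than 2D points, and a learned model would replace the brute-force nearest-neighbour search; the sketch only shows why a single query can yield both a category and a within-gesture position.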

Published in:

3D User Interfaces (3DUI), 2012 IEEE Symposium on

Date of Conference:

4-5 March 2012