We present a framework for 3D spatial gesture design and modeling. For gesture acquisition, we propose a wearable input device that combines visual sensors and body sensors. We adapt two pattern-matching techniques, Dynamic Time Warping (DTW) and Hidden Markov Models (HMMs), to support the registration and evaluation of 3D spatial gestures as well as their recognition. One key ingredient of our framework is a scheme for convenient gesture design and registration using HMMs. DTW is used to recognize gestures with limited training data and to evaluate how similar a performed gesture is to its template gesture. In our experimental evaluation, we designed 18 example gestures and analyzed the performance of the recognition methods and gesture features under various conditions. We also discuss between-user variability in gesture performance.
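To illustrate the DTW-based template matching described above, the following is a minimal sketch, not the authors' implementation: gestures are reduced to 1-D scalar sequences with an absolute-difference cost, and a performed gesture is classified by its nearest template under the DTW distance. The function names `dtw_distance` and `recognize` are illustrative assumptions.

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D sequences.

    Illustrative only: real gesture templates are multi-dimensional
    sensor streams; here each sample is a scalar and the local cost
    is the absolute difference.
    """
    n, m = len(a), len(b)
    inf = float("inf")
    # (n+1) x (m+1) accumulated-cost matrix with an infinite boundary
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def recognize(sample, templates):
    """Nearest-template classification: return the label of the
    template with the smallest DTW distance to the performed gesture."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

The DTW distance also serves as the evaluation score: the smaller the distance between a performed gesture and its registered template, the closer the performance is to the template.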