The paper demonstrates how ultrasonic hand tracking can improve the performance of a wearable, accelerometer- and gyroscope-based activity recognition system. Specifically, we target the recognition of manipulative gestures of the type found in assembly and maintenance tasks. We discuss how relevant information can be extracted from the ultrasonic signal despite the low sampling rate, occlusions, and reflections that occur in this type of application. We then introduce several methods of fusing the ultrasound and motion sensor information. We evaluate our methods on an experimental data set containing 21 different actions, performed repeatedly by three different subjects during simulated bike repair. Because the recognition task is complex, with many similar and vaguely defined actions and person-independent training, both the ultrasound and the motion sensors perform poorly on their own. With our fusion methods, however, recognition rates well over 90% can be achieved for most activities. In the extreme case, the recognition rate rises from just over 50% for the separate classifiers to nearly 89% with our fusion methods.
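The abstract does not specify which fusion schemes are used; one common late-fusion approach for combining two classifiers (here, a hypothetical ultrasound-based one and a motion-sensor-based one) is a weighted sum of their per-class posterior probabilities. The function name, the weight parameter, and the example scores below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def fuse_scores(p_ultrasound, p_motion, w=0.5):
    """Weighted-sum (late) fusion of two posterior distributions over classes.

    Hypothetical sketch: w weights the ultrasound classifier's scores
    against the motion-sensor classifier's; the result is renormalized
    so it remains a probability distribution.
    """
    p_u = np.asarray(p_ultrasound, dtype=float)
    p_m = np.asarray(p_motion, dtype=float)
    fused = w * p_u + (1.0 - w) * p_m
    return fused / fused.sum()

# Example: each classifier alone is uncertain about the activity class,
# but their combined evidence clearly favors class 2.
p_u = [0.30, 0.35, 0.35]   # ultrasound classifier, illustrative values
p_m = [0.30, 0.25, 0.45]   # motion-sensor classifier, illustrative values
print(int(np.argmax(fuse_scores(p_u, p_m))))  # → 2
```

The appeal of this kind of fusion is that the two modalities fail in different situations (e.g. ultrasound under occlusion, inertial sensors on similar-looking motions), so their combined scores can disambiguate cases where either classifier alone is unreliable.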