In this paper, we present image-based methods for the robust real-time recognition of static and dynamic hand gestures. These methods enable intuitive interaction with an assistance system, in which skin tones are used to segment the hands. The segmentation forms the basis of feature extraction for both static and dynamic gestures. For static gestures, the activation of a particular image region triggers an associated action, whereas an HMM classifier is used to recognize dynamic gestures based on the motion flow. The assistance system supports workers in manual tasks in the context of assembling complex products. This paper focuses on the interaction of the user with this system and describes work in progress, along with initial results from an application scenario.
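The abstract does not specify how the skin-tone segmentation is parameterized. As a minimal sketch of the general idea, the following pure-Python example classifies pixels as skin using fixed thresholds in the YCbCr color space, a common rule of thumb in the skin-segmentation literature; the specific thresholds and functions here are illustrative assumptions, not the authors' method.

```python
def rgb_to_ycbcr(r, g, b):
    # Standard ITU-R BT.601 conversion from RGB to YCbCr.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    # Commonly cited YCbCr skin-tone ranges; hypothetical here,
    # as the paper does not state its segmentation thresholds.
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

def segment(image):
    # image: list of rows of (r, g, b) tuples -> binary skin mask.
    return [[1 if is_skin(*px) else 0 for px in row] for row in image]
```

The resulting binary mask would then serve as the input for the feature extraction described in the abstract, e.g. locating the hand blob for region-activation checks (static gestures) or tracking its trajectory over time for the HMM (dynamic gestures).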