This research reports the recognition of facial movements during unvoiced speech and the identification of hand gestures using the surface electromyogram (sEMG). The paper proposes two different methods for identifying facial movements and hand gestures, which can be useful for providing simple commands and control to a computer, an important application of human-computer interaction (HCI). Experimental results demonstrate that features of sEMG recordings are suitable for characterising muscle activation during unvoiced speech and subtle gestures. Scatter plots from the two methods show the separation of the data for each vowel and each hand gesture. The results indicate that inter-experiment variation is small, while inter-subject variation is large. This inter-subject variation may be attributable to anatomical differences and to differences in speaking speed and style between subjects. The proposed system therefore performs best when it is trained and tested by the individual user. Possible applications of this research include providing simple computer commands for people with disabilities, developing prosthetic hands, and using sEMG classification in HCI-based systems.
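The abstract does not specify which sEMG features were used, but a common baseline for characterising muscle activation is to compute time-domain features such as root mean square (RMS) and mean absolute value (MAV) over short sliding windows. The sketch below illustrates this under assumed parameters (1 kHz sampling, 200 ms non-overlapping windows); the function name and signal are hypothetical, not taken from the paper.

```python
import numpy as np

def semg_features(signal, fs=1000, win_ms=200):
    """Compute RMS and MAV features over non-overlapping windows.

    Returns an (n_windows, 2) array: column 0 is RMS, column 1 is MAV.
    Window length and sampling rate are illustrative assumptions.
    """
    win = int(fs * win_ms / 1000)
    n = len(signal) // win
    segs = np.asarray(signal[:n * win], dtype=float).reshape(n, win)
    rms = np.sqrt(np.mean(segs ** 2, axis=1))
    mav = np.mean(np.abs(segs), axis=1)
    return np.column_stack([rms, mav])

# Synthetic 2 s recording with a simulated activation burst at 0.8-1.2 s
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.1, 2000)          # baseline noise
sig[800:1200] += rng.normal(0.0, 1.0, 400)  # simulated muscle activation
feats = semg_features(sig)                # shape (10, 2)
```

Feature vectors like these would then be fed to a classifier trained per user, consistent with the abstract's observation that the system works best when trained and tested on the individual user.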