Pattern recognition-based multifunction prosthesis control strategies have largely been demonstrated with subsets of typical able-bodied hand movements. These movements are often unnatural for the amputee, necessitate significant user training, and do not fully exploit the potential of residual muscle activity. This paper presents a real-time electromyography (EMG) classifier of user-selected intentional movements rather than an imposed subset of standard movements. EMG signals were recorded from the forearm extensor and flexor muscles of seven able-bodied participants and one congenital amputee. Participants freely selected and labeled their own muscle contractions through a unique training protocol. Signals were parameterized by the natural logarithm of root mean square (RMS) values, calculated over non-overlapping 0.2 s windows. The feature space was segmented using fuzzy C-means clustering. With only 2 min of training data from each user, the classifier discriminated four different movements with an average accuracy of 92.7% ± 3.2%. This accuracy could be further increased with additional training data and the improved user proficiency that comes with practice. The proposed method may facilitate the development of dynamic upper extremity prosthesis control strategies using arbitrary, user-preferred muscle contractions.
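The feature-extraction and clustering pipeline summarized above (ln-RMS values over non-overlapping 0.2 s windows, segmented by fuzzy C-means) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the one-dimensional feature formulation, and the deterministic centroid initialization are all assumptions made for clarity.

```python
import math

def ln_rms_features(emg, fs, win_s=0.2):
    """ln(RMS) of the signal in non-overlapping windows of win_s seconds."""
    n = int(fs * win_s)  # samples per window
    feats = []
    for start in range(0, len(emg) - n + 1, n):
        window = emg[start:start + n]
        rms = math.sqrt(sum(x * x for x in window) / n)
        feats.append(math.log(rms))
    return feats

def fuzzy_c_means(points, c, m=2.0, iters=100):
    """Toy 1-D fuzzy C-means; returns centroids and the membership matrix.

    Centroids are initialized evenly across the data range (an assumption;
    the paper does not specify the initialization).
    """
    lo, hi = min(points), max(points)
    centroids = [lo + k * (hi - lo) / (c - 1) for k in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # Membership update: inverse-distance weighting with fuzzifier m.
        for i, x in enumerate(points):
            d = [abs(x - v) + 1e-12 for v in centroids]  # avoid div-by-zero
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / dk) ** (2 / (m - 1)) for dk in d)
        # Centroid update: membership-weighted mean of the points.
        for j in range(c):
            num = sum((u[i][j] ** m) * x for i, x in enumerate(points))
            den = sum(u[i][j] ** m for i in range(len(points)))
            centroids[j] = num / den
    return centroids, u
```

In a real-time setting, each incoming 0.2 s window would be reduced to its ln-RMS feature and assigned to the cluster with the highest membership value, making this classification step cheap enough to run inside the control loop.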