In a human-robot interface, motion prediction based on task-context information has the potential to improve the robustness and reliability of motion classification for controlling prosthetic devices or human-assisting manipulators. This paper proposes a task model using a Bayesian network (BN) for motion prediction. Given information about the previous motion, the task model predicts the occurrence probabilities of the motions involved in the task. Furthermore, a hybrid motion classification framework has been developed based on the BN motion prediction. In parallel with the motion prediction, electromyogram (EMG) signals are classified by a probabilistic neural network (NN). The motion occurrence probabilities are then combined with the NN classifier's outputs to generate motion commands for control. The proposed framework is expected to enhance classification performance, yielding more robust and reliable motion commands. Experiments were conducted with four subjects to demonstrate the feasibility of the proposed methods; forearm motions were classified from EMG signals in the context of a cooking task. Finally, robot manipulation experiments were carried out to verify the proposed human interface system on a meal-taking task. The experimental results indicate that the proposed methods improved the robustness and stability of motion classification.
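The fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the motion names are hypothetical, the BN task model is reduced to a first-order transition table P(current motion | previous motion) with made-up values, and the combination rule (element-wise product of the BN occurrence probabilities and the NN class probabilities, renormalized) is an assumption about how the two sources are fused.

```python
# Hypothetical motion classes for a task (names are illustrative).
MOTIONS = ["reach", "grasp", "carry", "release"]

# Assumed BN task model reduced to a transition table:
# P(current motion | previous motion). All values are made up.
TRANSITION = {
    "reach":   [0.10, 0.70, 0.10, 0.10],
    "grasp":   [0.10, 0.10, 0.70, 0.10],
    "carry":   [0.10, 0.10, 0.10, 0.70],
    "release": [0.70, 0.10, 0.10, 0.10],
}

def combine(prev_motion: str, nn_probs: list[float]) -> list[float]:
    """Fuse the BN motion prediction with the NN classifier output.

    Assumed fusion rule: element-wise product of the BN occurrence
    probabilities (given the previous motion) and the NN class
    probabilities, renormalized to sum to 1.
    """
    prior = TRANSITION[prev_motion]
    fused = [p * q for p, q in zip(prior, nn_probs)]
    total = sum(fused)
    return [f / total for f in fused]

# Example: the EMG classifier is ambiguous between "grasp" and
# "release" right after a "reach"; the task context resolves it.
nn_out = [0.05, 0.45, 0.05, 0.45]
fused = combine("reach", nn_out)
command = MOTIONS[fused.index(max(fused))]
print(command)  # the task context favors "grasp"
```

The point of the sketch is that an ambiguous EMG classification can be disambiguated by the task model: two motions with equal NN scores receive different fused probabilities because the task makes one of them much more likely after the preceding motion.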