The neural-based control of a robotic hand has many clinical and engineering applications. Current approaches to this problem have been limited by a lack of understanding of the relationship between neural signals and dynamic finger movements. Here, we present a technique to predict index finger joint angles from neural signals recorded from the associated muscles. The neural signals are converted to an EMG-based torque estimate (EBTE) and then input to artificial neural networks. The networks predict finger position more closely when their inputs are torque estimates rather than raw neural signals. Furthermore, networks trained with the EBTE signals could predict the joint angles for different phases of finger movement (i.e., a dynamic reaching and positioning task), while networks trained with the raw neural signals could not. Our results indicate that (1) similar finger movements are executed with different synergistic strategies and (2) different phases of finger movement employ different neural strategies. Through these results, we demonstrate the first concrete technique to control a hand prosthetic device or dexterous tele-manipulator using natural neural control signals.
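The pipeline described above (muscle signals → torque estimate → neural network → joint angles) can be sketched as follows. This is a minimal illustration, not the authors' method: the abstract does not specify how the EBTE is computed, so the sketch substitutes a common EMG envelope (full-wave rectification plus a moving-average low-pass filter) as a stand-in torque estimate, and uses an untrained one-hidden-layer network with hypothetical dimensions on synthetic data.

```python
import numpy as np

def emg_envelope(emg, win=50):
    """Full-wave rectify and low-pass (moving average) one EMG channel.
    A crude stand-in for the paper's EMG-based torque estimate (EBTE);
    the actual estimator is not described in the abstract."""
    rectified = np.abs(emg)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def mlp_predict(x, W1, b1, W2, b2):
    """One-hidden-layer network mapping torque estimates to a joint angle.
    Weights here are random; in the paper the networks are trained on
    recorded movement data."""
    h = np.tanh(x @ W1 + b1)          # hidden layer
    return h @ W2 + b2                # predicted joint angle(s)

rng = np.random.default_rng(0)

# Synthetic EMG: amplitude-modulated noise standing in for a recording.
t = np.linspace(0.0, 1.0, 1000)
emg = rng.normal(size=1000) * (0.5 + 0.5 * np.sin(2 * np.pi * t))

env = emg_envelope(emg)               # stand-in EBTE input feature

# Hypothetical network sizes: 1 input feature -> 8 hidden -> 1 joint angle.
W1 = rng.normal(scale=0.1, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)

angles = mlp_predict(env[:, None], W1, b1, W2, b2)
```

In practice each finger joint would get its own output, the network would be trained against measured joint angles, and the envelope step would be replaced by the paper's actual torque estimator.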