Decoding Dexterous Finger Movements in a Neural Prosthesis Model Approaching Real-World Conditions

Authors: Joshua Egan (Department of Bioengineering, University of Utah, Salt Lake City, UT, USA); Justin Baker; Paul A. House; Bradley Greger

Dexterous finger movements can be decoded from neuronal action potentials acquired from a nonhuman primate using a chronically implanted Utah Electrode Array. We have developed an algorithm that can, after training, detect and classify individual and combined finger movements without any a priori knowledge of the data, task, or behavior. The algorithm is based on changes in the firing rates of individual neurons that are tuned for one or more finger movement types. Nine different movement types, which consisted of individual flexions, individual extensions, and combined flexions of the thumb, index finger, and middle finger, were decoded. The algorithm performed reliably on data recorded continuously during movement tasks, including a no-movement state, with an overall average sensitivity and specificity that were both >92%. These results demonstrate a viable algorithm for decoding dexterous finger movements under conditions similar to those required for a real-world neural prosthetic application.
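The abstract does not give the classifier's details, only that decoding rests on firing-rate changes in tuned units. As a rough illustration of that general idea, the sketch below simulates binned spike counts from hypothetical tuned units, builds per-class mean firing-rate templates from "training" trials, and classifies held-out trials by nearest template. All numbers (unit count, rates, window length, tuning model) and the nearest-template rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: 40 recorded units, 9 movement classes plus a no-movement
# state, firing rates estimated from spike counts in a 0.5 s window.
n_units, n_classes, win = 40, 10, 0.5
base = rng.uniform(5, 15, n_units)  # baseline rates in Hz (assumed)
# Each class modulates a random ~30% subset of "tuned" units (assumed model).
tuning = rng.uniform(0, 10, (n_classes, n_units)) * (rng.random((n_classes, n_units)) < 0.3)

def sample_rates(label, n_trials):
    """Simulated firing-rate vectors: Poisson counts over the window, in Hz."""
    lam = (base + tuning[label]) * win
    return rng.poisson(lam, (n_trials, n_units)) / win

# "Training": estimate each class's mean firing-rate template.
X_train = np.vstack([sample_rates(c, 50) for c in range(n_classes)])
y_train = np.repeat(np.arange(n_classes), 50)
templates = np.array([X_train[y_train == c].mean(axis=0) for c in range(n_classes)])

def decode(rates):
    """Classify a firing-rate vector by its nearest class template."""
    return int(np.argmin(np.linalg.norm(templates - rates, axis=1)))

# Evaluate on held-out simulated trials.
correct = total = 0
for c in range(n_classes):
    for r in sample_rates(c, 20):
        correct += decode(r) == c
        total += 1
print(f"accuracy: {correct / total:.2f}")
```

With well-separated tuning, even this simple template matcher decodes the simulated classes far above the 10% chance level; the published algorithm additionally handles continuous data and movement detection, which this sketch omits.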

Published in:

IEEE Transactions on Neural Systems and Rehabilitation Engineering (Volume: 20, Issue: 6)