Brain-machine interfaces (BMIs) provide a versatile tool for the rehabilitation of severely disabled people. Current BMI systems focus on the control of kinematic variables, but this approach limits BMI technology to simulated environments. Real-world rehabilitation robots, in contrast, must operate in a variety of complex physical situations; BMI systems aimed at prostheses must therefore control interaction forces with the environment. In this paper, we design a BMI-driven architecture that provides a critical link between neuronal ensemble activity and real-world dynamics. In particular, our system allows simultaneous estimation of the kinematic and stiffness variables of a prosthetic device. We achieve this by feeding the instantaneous activity of cortical neural ensembles into a musculoskeletal model of the arm, from which limb kinematics and dynamics are estimated and converted into control signals for the prosthetic device. Using real neural and behavioral data from nonhuman primates, we show that our architecture accurately predicts kinematic and stiffness variables across different dynamic situations. This architecture has strong implications for the development of the next generation of neural prosthetics that will restore motor function in neurologically impaired patients. Moreover, it provides a novel framework for studying how the brain learns and adapts to new environments.
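The pipeline the abstract describes — ensemble firing rates driving a musculoskeletal model whose output yields both kinematic (torque-related) and stiffness variables — can be sketched as follows. This is an illustrative toy, not the authors' implementation: the decoding weights `W`, the two-muscle antagonist model, and all parameter values (`moment_arm`, `f_max`, `k_gain`) are assumptions chosen only to show how co-contraction lets a single activation vector determine torque and stiffness simultaneously.

```python
import numpy as np

# Illustrative sketch (NOT the paper's model): linear neural-to-muscle
# decoding followed by a toy antagonist muscle pair.

rng = np.random.default_rng(0)

n_neurons, n_muscles = 50, 2                      # hypothetical sizes
W = rng.normal(0.0, 0.1, (n_muscles, n_neurons))  # stand-in for fitted weights

def decode_step(firing_rates, W):
    """Map ensemble firing rates to muscle activations in [0, 1]."""
    u = W @ firing_rates
    return 1.0 / (1.0 + np.exp(-u))  # squash to a physiological range

def muscle_model(act, moment_arm=0.03, f_max=500.0, k_gain=20.0):
    """Toy flexor/extensor pair at one joint.

    Net torque follows the *difference* of the two muscle forces, while
    joint stiffness grows with their *sum* (co-contraction) -- a standard
    simplification of muscle impedance.  This is why decoding muscle
    activations, rather than kinematics directly, yields both variables.
    """
    flexor, extensor = act
    torque = moment_arm * f_max * (flexor - extensor)
    stiffness = k_gain * f_max * moment_arm**2 * (flexor + extensor)
    return torque, stiffness

# One 50 ms bin of spike counts from the ensemble (simulated here).
rates = rng.poisson(10, n_neurons).astype(float)
act = decode_step(rates, W)
torque, stiffness = muscle_model(act)
print(f"torque = {torque:.3f} N*m, stiffness = {stiffness:.3f} N*m/rad")
```

In a real system, `W` would be fit from simultaneously recorded neural and EMG/behavioral data, and the torque and stiffness outputs would parameterize an impedance controller on the prosthesis rather than being printed.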