Recent advances in neuroscience and humanoid robotics have enabled initial demonstrations of brain-computer interfaces (BCIs) for controlling humanoid robots. However, previous BCIs have relied on higher-level control based on fixed, pre-wired behaviors. Low-level control, on the other hand, can be tedious and imposes a high cognitive load on the BCI user. To address these problems, we previously proposed an adaptive hierarchical approach to brain-computer interfacing: users teach the BCI system new skills on the fly, and these skills can later be invoked directly as high-level commands, relieving the user of tedious low-level control. In this paper, we explore the application of hierarchical BCIs to controlling a PR2 humanoid robot and teaching it new skills. We further explore explicitly defined sequences of commands as a way for the user to specify a more complex task spanning multiple state spaces. We report results from three subjects who used a hierarchical electroencephalogram (EEG)-based BCI to successfully train and control the PR2 humanoid robot in a simulated household task: maneuvering the robot's arm to pour milk over a bowl of cereal. We present the first demonstration of training a hierarchical BCI for a non-navigational task, as well as the first use of such a BCI to train a more complex task involving multiple state spaces.
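To make the hierarchical-control idea concrete, the following is a minimal sketch of how low-level commands might be recorded into a named skill and later replayed as a single high-level command. This is an illustration under assumed structure, not the authors' implementation; the class and all command names (HierarchicalController, pour_milk, move_arm_left, and so on) are hypothetical.

```python
# Minimal sketch of hierarchical BCI control (hypothetical names throughout):
# low-level commands issued during teaching are recorded into a named skill,
# which afterwards executes as one high-level command.

class HierarchicalController:
    def __init__(self):
        self.skills = {}       # skill name -> recorded list of low-level commands
        self.recording = None  # name of the skill currently being taught, if any

    def start_teaching(self, skill_name):
        """Begin recording the user's low-level commands as a new skill."""
        self.recording = skill_name
        self.skills[skill_name] = []

    def stop_teaching(self):
        """Finish recording; the skill is now invokable as one command."""
        self.recording = None

    def execute(self, command):
        """Run a command: either a learned skill or a low-level primitive."""
        if command in self.skills:
            # High-level invocation: replay the stored command sequence.
            for primitive in self.skills[command]:
                self._send_to_robot(primitive)
        else:
            if self.recording is not None:
                self.skills[self.recording].append(command)
            self._send_to_robot(command)

    def _send_to_robot(self, primitive):
        print(f"robot <- {primitive}")  # stand-in for actual robot control


# Usage: teach a skill step by step, then invoke it as one command.
controller = HierarchicalController()
controller.start_teaching("pour_milk")
for cmd in ["move_arm_left", "lower_gripper", "tilt_wrist"]:
    controller.execute(cmd)   # low-level steps demonstrated under BCI control
controller.stop_teaching()
controller.execute("pour_milk")  # later invoked as a single high-level command
```

Once taught, the skill sits at the same level as the pre-wired commands, which is what relieves the user of repeating the tedious low-level sequence; a sequence of such skills could, in the same spirit, stand in for the explicitly defined multi-command tasks described above.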