We present the development of a robot system with cognitive capabilities, together with experimental results. We focus on two topics: two-handed assembly and the understanding of human instructions given in unconstrained natural language. These two abilities distinguish human beings from animals and are thus stepping stones toward high-level intelligence. A typical application of such a system is human-robot cooperative assembly: a human communicator who shares a view of the assembly scene with the robot instructs it by speaking to it, much as one would speak to a child whose common-sense knowledge is limited. The instructions may be underspecified, incomplete, and/or context dependent. After introducing the general purpose of our research project, we present the hardware and software components of our robots that are needed for interactive assembly tasks. We then discuss the control architecture of the robot system, which comprises two stationary robot arms, by describing its perception, instruction-understanding, and execution functionalities. To show how our robot learns from humans, we introduce the implementations of a layered learning methodology and of memory and monitoring functions. Finally, we outline future research topics related to the enhancement of such systems.