Research on robotics and artificial intelligence currently interprets the interaction between human and robotic subjects through a set of mediated controls that engage almost all human senses. The underlying communication requires establishing a set of interaction metaphors that allow a natural form of control; the most common of these metaphors is the kinematic copy of human gestures. Over the last ten years, remarkable progress has been achieved in cognitive science, psychology, neuroscience, physiology, and neurophysiology. Knowledge of the relationship between perception and sensorimotor mechanisms and the brain has never been so advanced. One of the major outcomes is that it is now possible to correlate brain neuronal activity with the corresponding perception and motion and, at the same time, to record this activity through the use of innovative devices. The combined adoption of these results makes it possible to dissolve the physical constraints that act as a boundary in the interaction with and control of robots and virtual-environment entities. A new generation of brain and body (computer) interfaces (BBCI) will allow the direct transfer of user intention and the realization of remote sensing. In this way, it will become possible to communicate interactively with a robot without physically performing any action. Virtual and robotic bodies can be controlled and moved even in the absence of a corresponding motion by the controlling subject: the robot is moved by thought alone, while the robot's perception is transferred directly to the human through worn interfaces (head-mounted displays, skin and body stimulators, ...).
This lecture traces the path from teleoperation and virtual-environment interaction toward new methods for recreating the illusion of surrogating our own bodies in different entities, whether robotic or virtual, and investigates how the relevant perception-action loops will be affected.