The current trend toward robot assistants and autonomous task execution in minimally invasive surgery increases the operational complexity of telepresence systems, while the available input channels remain limited to traditional human-computer interfaces. We introduce two human-robot interfacing modalities that aim to make robotic surgery more intuitive. First, to reduce the surgeon's mental load, gaze-contingent camera control is implemented: eye tracking is performed by means of head-worn tracking goggles, which are tightly integrated with a stereoscopic visualization system based on the polarization method. The second technique supports scrub nurses during surgical tool interaction, e.g. tool exchange, via haptic gestures executed on the robot. Strain-gauge sensors installed on the instrument detect hand-tapping sequences, which trigger the activation of specified commands.
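The gaze-contingent camera control described above can be sketched as a simple servo law: normalized gaze coordinates drive pan/tilt rate commands so the camera recenters on the fixation point, with a central dead zone suppressing motion during fixations near the image center. This is a minimal illustration, not the authors' implementation; the function name, dead-zone width, and gain are assumptions.

```python
import math

def gaze_to_camera_rate(gx, gy, dead_zone=0.1, gain=0.5):
    """Map normalized gaze coordinates in [-1, 1] (0 = image center)
    to pan/tilt rate commands (hypothetical parameters).

    Inside the dead zone the camera holds still, so small fixational
    eye movements do not jitter the view; outside it, the commanded
    rate grows proportionally with the gaze offset from center.
    """
    def axis_rate(v):
        if abs(v) <= dead_zone:
            return 0.0  # fixation near center: keep the camera still
        # proportional rate, measured from the dead-zone edge
        return gain * (v - math.copysign(dead_zone, v))
    return axis_rate(gx), axis_rate(gy)
```

In practice such a controller would be smoothed and rate-limited, since raw gaze estimates are noisy and saccades should not be tracked; the proportional form shown is only the core mapping.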
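Detecting hand-tapping sequences on a strain-gauge force signal can be sketched as threshold crossing with a debounce (refractory) window, followed by mapping the tap count to a command. This is a hedged sketch of one plausible scheme, not the paper's detector; the thresholds, refractory period, and command names are assumptions.

```python
def detect_taps(signal, fs, threshold, refractory_s=0.15):
    """Return sample indices of rising threshold crossings in a
    strain-gauge force signal (assumed scheme, not the paper's).

    Crossings that fall inside the refractory window after a detected
    tap are ignored, so the ringing of a single tap is not counted
    multiple times.
    """
    refractory = int(refractory_s * fs)
    taps, last = [], -refractory
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i] and i - last >= refractory:
            taps.append(i)
            last = i
    return taps

def taps_to_command(n_taps, command_map):
    """Map a completed tap count to a command, or None if the count
    is unassigned (the command names are purely illustrative)."""
    return command_map.get(n_taps)
```

A sequence would typically be considered complete once no new tap arrives within some maximum inter-tap gap; distinct counts (e.g. two vs. three taps) can then select distinct robot commands.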