A diversity of action-perception applications relies on 2D information to perform 3D tasks; minimally invasive surgery (MIS) is one of them. MIS demands a high degree of sensory-motor skill to overcome the disengagement between action and perception caused by the physical separation of the surgeon from the operative site. Integrating body movements with visual information assists the surgeon by providing a sense of position. The purpose of this paper is to present exterior orientation as a tool for assisted interventions, locating the instruments with respect to the surgeon. Enhanced perception is obtained by augmenting the 2D information with position and orientation data through a human-machine interface. By applying motion analysis to a sequence of images and using knowledge of the 3D transformations applied to the instrument, we show that its orientation can be estimated from only two different rotations, and that its position can also be estimated when its length is supplied.
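To illustrate the kind of geometry involved, the following is a minimal sketch (not the paper's algorithm) of how two rotation angles of a rigid instrument of known length can be recovered from its 2D projection. It assumes an orthographic camera model and parameterizes the instrument direction by an azimuth (rotation in the image plane) and an elevation (tilt away from the image plane); all names and the camera model are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch, assuming an orthographic camera: recover the two
# rotation angles (azimuth, elevation) of an instrument of known length L
# from its 2D projection. Not the paper's method; names are hypothetical.

L = 0.10          # instrument length in metres (assumed known)
azimuth = 0.6     # ground-truth rotation in the image plane (rad)
elevation = 0.4   # ground-truth tilt away from the image plane (rad)

# 3D unit direction of the instrument after the two rotations.
d = np.array([np.cos(elevation) * np.cos(azimuth),
              np.cos(elevation) * np.sin(azimuth),
              np.sin(elevation)])

# Orthographic projection onto the image plane: drop the z component.
# The observed 2D segment runs from the origin to tip_2d.
tip_2d = L * d[:2]

# Recover the angles from the 2D measurement alone:
# - azimuth from the direction of the projected segment,
# - elevation from its foreshortening relative to the known length L.
az_est = np.arctan2(tip_2d[1], tip_2d[0])
el_est = np.arccos(np.linalg.norm(tip_2d) / L)

print(az_est, el_est)  # matches the ground-truth angles
```

Note that under this simplified model the known length resolves only the tilt ambiguity; recovering full 3D position, as the paper proposes, additionally requires a perspective model relating image coordinates to depth.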