Our interaction with machines is severely constrained by unnatural interfaces such as the mouse, keyboard, and joystick. These interfaces make it difficult to convey our intentions to computers so that they can understand and perform the tasks we intend. In this research, our aim is to build systems that use natural human actions as interfaces; in particular, we exploit gaze to track a person's focus of attention. We present an active gaze-tracking system that enables a user to instruct a robot arm to pick up and hand over an object placed arbitrarily in 3D space. The system determines the precise 3D position of an object of unknown size, shape, and color by following the person's steady gaze.
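The abstract does not describe how the 3D fixation point is computed. As an illustration only, one common way to recover a 3D point from gaze is to triangulate the two eye gaze rays: take the midpoint of the shortest segment between the left- and right-eye rays. The sketch below assumes that each eye's position and gaze direction are already available from the tracker; all function and variable names here are hypothetical, not from the paper.

```python
# Hedged sketch: triangulating a 3D fixation point from two gaze rays.
# This is NOT the paper's stated method, just a standard closest-approach
# construction for two rays o1 + t1*d1 and o2 + t2*d2 in 3D.

def dot(a, b):
    # Dot product of two 3-vectors given as lists.
    return sum(x * y for x, y in zip(a, b))

def fixation_point(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two gaze rays.

    o1, o2 -- 3D eye positions; d1, d2 -- gaze direction vectors.
    Returns None if the rays are (near-)parallel, i.e. no fixation.
    """
    w = [o1[i] - o2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # parallel gaze rays: point at infinity
    t1 = (b * e - c * d) / denom  # parameter along the first ray
    t2 = (a * e - b * d) / denom  # parameter along the second ray
    p1 = [o1[i] + t1 * d1[i] for i in range(3)]
    p2 = [o2[i] + t2 * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2.0 for i in range(3)]
```

For example, eyes at (-0.03, 0, 0) and (0.03, 0, 0) metres whose gaze rays converge on a target one metre ahead yield a fixation estimate at (0, 0, 1); an estimate like this could then be handed to the robot arm as a grasp target.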
Date of Conference: 19-21 May 2005