Active Vision During Coordinated Head/Eye Movements in a Humanoid Robot

Authors: Xutao Kuang (Dept. of Psychology, Boston University, Boston, MA, USA); M. Gibson; B. E. Shi; M. Rucci

While looking at a point in the scene, humans continually perform smooth eye movements to compensate for involuntary head rotations. Since the optical nodal points of the eyes do not lie on the head rotation axes, this behavior yields useful 3-D information in the form of visual parallax. Here, we describe the replication of this behavior in a humanoid robot. We have developed a method for egocentric distance estimation based on the parallax that emerges during compensatory head/eye movements. This method was tested on a robotic platform equipped with an anthropomorphic neck and two binocular pan-tilt units specifically designed to reproduce the visual input signals experienced by humans. We show that this approach yields accurate and robust estimation of egocentric distance within the space near the agent. These results provide a further demonstration of how behavior facilitates the solution of complex perceptual problems.
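The paper itself does not include code, but the geometry behind the abstract can be illustrated with a minimal sketch. The idea is that a head rotation translates the eye's nodal point (because the nodal point is offset from the rotation axis), and keeping a target fixated then requires a small extra eye rotation; that parallax angle, together with the induced baseline, triangulates the target's egocentric distance. The function below is a simplified, hypothetical illustration (single vertical rotation axis, parallax read out as extra eye rotation, made-up parameter names), not the authors' implementation.

```python
import numpy as np

def estimate_egocentric_distance(head_angle_rad, nodal_offset_m, parallax_angle_rad):
    """Toy parallax-based distance estimate during a compensatory head/eye movement.

    head_angle_rad     : head rotation about its (assumed vertical) axis
    nodal_offset_m     : offset of the eye's optical nodal point from the
                         head rotation axis (hypothetical calibration value)
    parallax_angle_rad : extra eye rotation needed to keep the fixated
                         target foveated, i.e. the parallax angle
    """
    # The head rotation translates the nodal point along a chord of a circle
    # of radius nodal_offset_m; that chord length is the triangulation baseline.
    baseline_m = 2.0 * nodal_offset_m * np.sin(head_angle_rad / 2.0)
    # The fixated point and the two nodal-point positions form a thin triangle
    # whose apex angle is the parallax angle; solve it for the distance.
    return baseline_m / np.tan(parallax_angle_rad)

# Example: a 5 deg head rotation with the nodal point 10 cm off the axis and
# 0.6 deg of extra eye rotation implies a target roughly 0.8 m away.
d = estimate_egocentric_distance(np.deg2rad(5.0), 0.10, np.deg2rad(0.6))
print(f"estimated egocentric distance: {d:.2f} m")
```

In practice the parallax angle would come from the robot's eye encoders or from visual tracking of the fixated target, and the nodal-point offset from a kinematic calibration of the head; the small-triangle approximation above is only intended to convey why the compensatory behavior carries distance information.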

Published in:

IEEE Transactions on Robotics (Volume 28, Issue 6)