Viewing and reviewing how humanoids sensed, planned and behaved with Mixed Reality technology

5 Author(s)
Kobayashi, K.; Nishiwaki, K.; Uchiyama, S.; Yamamoto, H.; et al. — Human Machine Perception Lab., Canon Inc., Tokyo

How can we see how humanoids sensed, planned, and behaved in the actual environment? We propose a mixed reality environment for viewing and reviewing the internal parameters computed by humanoids, overlaid on the actual environment. The parameters, snapshotted at each sampling time, are treated as data streams and stored to distributed log servers in real time. A 3-D graphical representation of the parameters helps us observe multiple multi-dimensional parameters at once, and a video see-through head-mounted display is used for viewing this representation. The stored parameters can be projected onto the current actual scene from arbitrary viewpoints. With these viewing and reviewing functions, the mixed reality environment becomes a powerful tool for developers of autonomous behaviors, letting them debug the actual behavior against the actual environment. This paper describes an implementation of the system with a full-size humanoid, HRP-2, and shows some experimental examples.
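The core logging idea in the abstract — snapshotting internal parameters at each sampling time, treating each parameter set as a timestamped data stream, and storing it to a log server for later replay — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class names (`ParameterSnapshot`, `LogServer`), the example stream name `planned_footsteps`, and the 5 ms sampling period are all assumptions made for the example.

```python
import json
from dataclasses import dataclass, field


@dataclass
class ParameterSnapshot:
    """One sample of a humanoid's internal parameters, tagged with its sampling time."""
    timestamp: float                        # seconds since start of the run
    stream: str                             # which parameter stream this sample belongs to
    values: dict = field(default_factory=dict)

    def to_record(self) -> str:
        # Serialize to a self-describing record suitable for a log server.
        return json.dumps({"t": self.timestamp, "stream": self.stream, "values": self.values})


class LogServer:
    """Toy in-memory stand-in for one of the distributed log servers."""

    def __init__(self):
        self.records: list[str] = []

    def append(self, record: str) -> None:
        # In the real system this would arrive over the network in real time.
        self.records.append(record)

    def replay(self, stream: str) -> list[dict]:
        """Return the stored snapshots of one stream, in order, for later review."""
        return [rec for rec in (json.loads(r) for r in self.records)
                if rec["stream"] == stream]


# Each control cycle, snapshot the current parameters and push them to the server.
server = LogServer()
for tick in range(3):
    snap = ParameterSnapshot(timestamp=tick * 0.005,        # assumed 5 ms sampling period
                             stream="planned_footsteps",
                             values={"x": 0.1 * tick, "y": 0.0})
    server.append(snap.to_record())

# Reviewing: replay the stream to drive a 3-D overlay from any viewpoint.
replayed = server.replay("planned_footsteps")
```

Storing each sample as an ordered, timestamped record is what makes the "reviewing" half possible: the replayed stream can be rendered into the mixed reality view independently of when and where it was captured.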

Published in:

2007 7th IEEE-RAS International Conference on Humanoid Robots

Date of Conference:

Nov. 29 - Dec. 1, 2007