Abstract:
This work presents a novel vision-based navigation strategy for autonomous humanoid robots using augmented reality (AR). In the first stage, a platform is developed for indoor and outdoor human positioning and navigation using mobile augmented reality. An image sequence is obtained from a smartphone's camera, and location information is provided to the user in the form of 3D graphics and audio effects. To recognize a location, an image database and location model are pre-constructed to relate a detected AR marker's position to a map of the environment. The AR markers act as active landmarks placed in unexplored environments, providing location information once detected by a camera. The second stage implements the same algorithm on an autonomous humanoid robot as its navigation module. This is achieved by coupling the robot's odometry and inertial sensing with the visual marker detection module. With this system, the robot employs its vision system to improve localization robustness and to recover quickly when lost by detecting the active landmarks, i.e., the AR markers. The problem of motion blur caused by the 6-DOF motion of the humanoid's camera is addressed with an adaptive thresholding technique developed to increase the robustness of AR marker detection under varying illumination conditions and camera movements. For our experiments, we used the humanoid robot NAO and verified the performance of this navigation methodology in real-world scenarios.
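The abstract does not specify the exact adaptive thresholding variant used for marker detection. As a hedged illustration only, the sketch below implements one common choice, local-mean adaptive thresholding with an integral image, which binarizes each pixel against the mean of its neighborhood rather than a single global value; this is the property that helps under uneven illumination and mild motion blur. The function name, window size, and offset constant are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def adaptive_threshold(gray, block=15, c=7):
    """Binarize a grayscale image with a per-pixel threshold taken from
    the local mean over a block x block window (block must be odd).

    A pixel becomes foreground (0, e.g. a dark marker border) when it is
    darker than its local mean minus an offset c; otherwise background (255).
    The local means are computed in O(1) per pixel via an integral image.
    Illustrative sketch only -- not the paper's exact method.
    """
    h, w = gray.shape
    pad = block // 2
    # Edge-replicate padding so every pixel has a full window.
    padded = np.pad(gray.astype(np.float64), pad, mode="edge")
    # Integral image with a leading zero row/column.
    ii = np.pad(padded.cumsum(axis=0).cumsum(axis=1), ((1, 0), (1, 0)))
    # Sum over each block x block window via four integral-image lookups.
    local_sum = (ii[block:, block:] - ii[:-block, block:]
                 - ii[block:, :-block] + ii[:-block, :-block])
    local_mean = local_sum / (block * block)
    # Darker than the local mean (minus offset) -> marker foreground.
    return np.where(gray.astype(np.float64) < local_mean - c,
                    0, 255).astype(np.uint8)
```

Because the threshold adapts per pixel, a dark marker square remains separable from the background even when a global threshold would fail, e.g. under a brightness gradient across the frame.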
Published in: 2011 IEEE International Conference on Mechatronics
Date of Conference: 13-15 April 2011
Date Added to IEEE Xplore: 01 August 2011