A method to segment floor and obstacle regions from images is a fundamental capability for robots operating in the real world. This paper describes a floor detection method that integrates binocular stereo vision and whole-body information for walking direction control of a humanoid robot. We developed a humanoid robot navigation system based on a vision-based local floor map. The developed system consists of a map building stage and a walking direction control stage. In the map building stage, the system builds a local floor map around the robot by integrating floor region information from visual input with whole-body posture information. The Plane Segment Finder (PSF) algorithm, which extracts planar surfaces from 3D vision input, is utilized to segment floor and obstacle regions. The floor region segmented from the input images is represented in view coordinates; the whole-body posture information is then used to transform it from view coordinates to body coordinates to build the local floor map. In the walking direction control stage, the system searches the local floor map for an open-space direction and steers the walking direction toward the open space to avoid obstacles. Finally, walking navigation experiments based on floor detection using a life-size humanoid robot are shown.
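The two stages described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the 4x4 homogeneous transform `T_body_cam` (assumed to be computed from the robot's whole-body posture), and the angular-histogram search for open space are all assumptions made for illustration.

```python
import numpy as np

def view_to_body(points_view, T_body_cam):
    """Map building step (hypothetical sketch): transform Nx3 floor points
    from camera (view) coordinates to body coordinates using a 4x4
    homogeneous transform derived from the whole-body posture."""
    pts_h = np.hstack([points_view, np.ones((points_view.shape[0], 1))])
    return (T_body_cam @ pts_h.T).T[:, :3]

def open_space_heading(floor_xy, num_bins=9, fov=np.pi / 2):
    """Walking direction control step (hypothetical sketch): return the
    heading (radians, 0 = straight ahead, positive = left) whose angular
    bin within the field of view contains the most detected floor points,
    i.e. the most open space on the local floor map."""
    # x is forward, y is left in body coordinates (an assumed convention)
    angles = np.arctan2(floor_xy[:, 1], floor_xy[:, 0])
    edges = np.linspace(-fov / 2, fov / 2, num_bins + 1)
    counts, _ = np.histogram(angles, bins=edges)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[np.argmax(counts)]
```

A usage example: with an identity transform, `view_to_body` leaves points unchanged; if most detected floor points lie to the robot's left, `open_space_heading` returns a positive (leftward) heading, steering the walk away from the obstructed side.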