Our environment is replete with visual cues intended to guide human navigation: building directories at entrances, room numbers next to doors. By developing a robot wheelchair system that can interpret these cues, we can create a more robust and more usable system. This paper describes the design and development of our robot wheelchair system, called Wheeley, and its vision-based navigation system. The robot wheelchair uses stereo vision to build maps of the environment through which it travels; these maps can then be annotated with information gleaned from signs. We also describe the planned integration of an assistive robot arm to help with tasks such as pushing elevator buttons and opening doors.
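The abstract does not specify Wheeley's map representation, but the idea of attaching sign text to a metric map can be sketched as follows. This is a minimal illustration, assuming an occupancy-grid map whose cells carry optional text labels; the class and method names (`AnnotatedGridMap`, `annotate`, `find`) are hypothetical, not from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class AnnotatedGridMap:
    """Occupancy grid whose cells can carry text labels read from signs.

    A hypothetical sketch: a stereo-vision mapper would fill in occupancy
    probabilities, and a sign reader would attach labels to map cells.
    """
    width: int
    height: int
    occupancy: list = field(default_factory=list)   # per-cell P(occupied)
    annotations: dict = field(default_factory=dict)  # (x, y) -> label text

    def __post_init__(self):
        # Initialize all cells to 0.5 (unknown occupancy).
        self.occupancy = [[0.5] * self.width for _ in range(self.height)]

    def mark_occupied(self, x: int, y: int, p: float = 0.9) -> None:
        """Record an obstacle observation (e.g. a wall) at cell (x, y)."""
        self.occupancy[y][x] = p

    def annotate(self, x: int, y: int, label: str) -> None:
        """Attach sign text (e.g. a room number) to the cell where it was seen."""
        self.annotations[(x, y)] = label

    def find(self, label: str) -> list:
        """Look up cells by label, so a sign can serve as a navigation goal."""
        return [cell for cell, text in self.annotations.items() if text == label]


# Usage: map a door sign, then recover its location as a goal.
m = AnnotatedGridMap(10, 10)
m.mark_occupied(3, 4)            # wall segment seen by stereo vision
m.annotate(3, 4, "Room 214")     # sign text read next to that wall
print(m.find("Room 214"))        # [(3, 4)]
```

Keeping annotations in a side dictionary rather than inside the grid keeps the occupancy data compact and lets the same map support both obstacle avoidance and label-based goal queries.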