
Look where you're going [robotic wheelchair]

3 Author(s)

We propose a robotic wheelchair that observes both the user and the environment. It can understand the user's intentions from his or her behavior together with environmental information. It also observes the user when he or she is off the wheelchair, recognizing commands indicated by hand gestures. Experimental results show our approach to be promising. Although the current system uses face direction, for people who find it difficult to move their faces, it can be modified to use movements of the mouth, eyes, or any other body part they can move. Since such movements are generally noisy, integrating observation of the user and the environment will be effective in understanding the user's real intentions and will be a useful technique for better human interfaces.
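The fusion idea in the abstract can be illustrated with a minimal sketch: a noisy directional cue from the user (e.g., estimated face direction) is matched against environmentally feasible goals (e.g., doorway bearings), and directions the environment rules out are discarded. All function names, thresholds, and the goal-selection rule below are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: inferring the user's intended direction by combining
# a noisy face-direction estimate with environmental information.
# Thresholds and the selection rule are assumptions for illustration only.

def infer_intended_goal(face_direction_deg, candidate_goals_deg,
                        blocked_deg, tolerance=30.0):
    """Pick the candidate goal (e.g., a doorway bearing in degrees) nearest
    the observed face direction, skipping any goal whose bearing lies close
    to a blocked (non-traversable) direction. Returns None if no candidate
    falls within the tolerance."""
    def angular_diff(a, b):
        # Smallest absolute difference between two bearings on a circle.
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    best, best_diff = None, tolerance
    for goal in candidate_goals_deg:
        if any(angular_diff(goal, b) < 10.0 for b in blocked_deg):
            continue  # environment says this direction is not traversable
        d = angular_diff(face_direction_deg, goal)
        if d <= best_diff:
            best, best_diff = goal, d
    return best
```

Because the raw cue is noisy, filtering candidates through the environment model (rather than steering directly along the face direction) is what makes the integration robust, which is the point the abstract makes.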

Published in:

IEEE Robotics & Automation Magazine (Volume: 10, Issue: 1)