We propose a robotic wheelchair that observes both the user and the environment. It infers the user's intentions from his/her behaviors combined with environmental information, and it also observes the user when he/she is off the wheelchair, recognizing commands given by hand gestures. Experimental results show our approach to be promising. Although the current system uses face direction, for people who find it difficult to move their faces it can be adapted to use movements of the mouth, eyes, or any other body parts they can move. Since such movements are generally noisy, integrating observation of the user with observation of the environment is effective for understanding the user's real intentions and should prove a useful technique for better human interfaces.
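The integration idea above can be sketched in code. The snippet below is a minimal, hypothetical illustration (the function name, data format, and voting scheme are our assumptions, not the paper's implementation): noisy per-frame estimates of the user's face direction are filtered against an environment map of passable directions, and the most frequently observed passable direction is taken as the inferred intention.

```python
from collections import Counter

def infer_intention(face_samples, passable):
    """Hypothetical fusion of noisy user observations with environment info.

    face_samples: list of per-frame face-direction estimates, e.g. "left".
    passable: dict mapping a direction to whether the environment map
              marks it as traversable.

    Returns the most frequent direction that is also passable, so that
    spurious glances toward walls or obstacles are discarded, or None
    if no observed direction is traversable.
    """
    counts = Counter(d for d in face_samples if passable.get(d, False))
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Example: the user mostly looks left; "right" is blocked by an obstacle.
samples = ["left", "left", "right", "left", "forward"]
env = {"left": True, "right": False, "forward": True}
print(infer_intention(samples, env))  # -> left
```

In a real system the vote would run over a sliding time window and the environment map would come from the wheelchair's range sensors, but the structure is the same: neither signal alone is trusted; their agreement defines the command.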