This paper describes an intelligent wheelchair with a novel interface driven by visual oral motion. The wheelchair operates in various lighting environments, conditions that traditional mouth-region detection methods do not take into account. To detect the mouth region and analyze mouth motion in real time under varying lighting conditions, we propose a mouth cavity region detection method. In our system, the user opens and closes his or her mouth to drive the wheelchair. We carried out six operation experiments and running experiments. The experiments with five subjects showed that our system can be used to a certain extent in a campus corridor.
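The core idea, detecting the dark mouth cavity rather than lip contours so that detection stays robust to illumination changes, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the illumination-relative threshold factor, the area threshold, and the function names are all assumptions made for illustration.

```python
import numpy as np

def detect_mouth_cavity(gray, min_area=50):
    """Find the dark mouth-cavity region in a grayscale mouth-area crop.

    The open mouth cavity stays dark relative to the surrounding skin
    even as overall illumination changes, so we threshold relative to
    the frame's own mean brightness rather than using a fixed value
    (the 0.4 factor here is an assumed, illustrative choice).
    """
    thresh = gray.mean() * 0.4          # illumination-relative threshold
    ys, xs = np.nonzero(gray < thresh)  # candidate dark (cavity) pixels
    if len(xs) < min_area:              # too few dark pixels: mouth closed
        return None
    # bounding box of the dark pixels, taken as the cavity region
    return (xs.min(), ys.min(), xs.max(), ys.max())

def mouth_open(gray, min_area=50):
    """True when a sufficiently large dark cavity region is present."""
    return detect_mouth_cavity(gray, min_area) is not None
```

A real-time system would apply this per frame to a face-tracked mouth region and map the open/close state to drive commands (e.g. open to go, closed to stop).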