Use of forehead bio-signals for controlling an Intelligent Wheelchair

3 Author(s)
Lai Wei (Dept. of Comput. & Electron. Syst., Univ. of Essex, Colchester); Huosheng Hu; Kui Yuan

This paper presents a novel method for classifying human facial movements based on multi-channel forehead bio-signals. Five facial movements from three facial regions (forehead, eye, and jaw) are selected and classified by a back-propagation artificial neural network (BPANN) using a combination of transient and steady-state features from EMG and EOG waveforms. The identified facial movements are then used to generate five control commands for driving a simulated intelligent wheelchair. A human-machine interface (HMI) maps the movement patterns to their corresponding control commands via a logic control table. Simulation results demonstrate the feasibility and performance of the proposed system, which can be extended to real-world applications as a control interface for disabled and elderly users.
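
To illustrate the command-mapping step described in the abstract, the following is a minimal Python sketch of how classified movement patterns could be translated into wheelchair commands through a lookup (logic control) table. The movement labels, command names, and the command_from_movement helper are hypothetical assumptions for illustration only, not the authors' actual implementation.

# Hypothetical sketch: map a classifier's output label to a wheelchair
# control command via a lookup (logic control) table. Labels and commands
# are illustrative assumptions, not the authors' implementation.

FACE_MOVEMENT_TO_COMMAND = {
    "forehead_lift":  "forward",
    "left_eye_wink":  "turn_left",
    "right_eye_wink": "turn_right",
    "jaw_clench":     "stop",
    "jaw_drop":       "backward",
}

def command_from_movement(movement_label: str) -> str:
    """Return the control command for a classified facial movement,
    defaulting to a safe 'stop' for any unrecognised label."""
    return FACE_MOVEMENT_TO_COMMAND.get(movement_label, "stop")

# Example: a BPANN output of "jaw_clench" halts the simulated wheelchair.
print(command_from_movement("jaw_clench"))  # -> stop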

Published in:

2008 IEEE International Conference on Robotics and Biomimetics (ROBIO 2008)

Date of Conference:

22-25 Feb. 2009