A Human-Machine Interface for assistive exoskeleton based on face analysis

4 Author(s)
Malek Baklouti ; Faculty of Versailles, Robotic and Signal processing, France ; Michael Bruin ; Vincent Guitteny ; Eric Monacelli

This paper proposes a human-machine interface for an assistive exoskeleton based on face analysis. The 4-DoF assistive robotic system is dedicated to people suffering from myopathy and aims to compensate for the loss of mobility in the upper limb. The proposed interface converts the user's head gestures and mouth expressions into suitable control commands. Moreover, we propose a visual context analysis component to produce more accurate commands. The tests conducted show that a vision-based interface is particularly well adapted to disabled people. In this paper, we first describe the problem and the designed mechanical system. Next, we describe the two approaches developed for the visual sensing interface: head control and mouth expression control, with a focus on the mouth extraction algorithm. Finally, we introduce context detection for scene understanding.
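The abstract describes converting head gestures into control commands. As an illustration only (the paper does not publish its mapping), a minimal sketch of such a conversion might threshold an estimated head pose against a dead zone, so small involuntary motions do not move the exoskeleton; the function name, threshold value, and command labels below are all hypothetical assumptions:

```python
# Hypothetical sketch, NOT the authors' algorithm: map an estimated head
# pose (yaw/pitch in degrees, from some upstream face-analysis step) to a
# coarse directional command for an assistive arm.

DEAD_ZONE_DEG = 10.0  # assumed threshold; the paper gives no such value


def head_pose_to_command(yaw_deg: float, pitch_deg: float) -> str:
    """Convert head yaw/pitch into one of: hold, left, right, up, down."""
    # Head near neutral: issue no motion command (rejects small jitters).
    if abs(yaw_deg) < DEAD_ZONE_DEG and abs(pitch_deg) < DEAD_ZONE_DEG:
        return "hold"
    # Otherwise, the axis with the larger deflection wins.
    if abs(yaw_deg) >= abs(pitch_deg):
        return "left" if yaw_deg > 0 else "right"
    return "up" if pitch_deg > 0 else "down"
```

In a real system this discrete command would feed the low-level controller of the 4-DoF arm; a proportional (rather than thresholded) mapping is an equally plausible design.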

Published in:

2008 2nd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics

Date of Conference:

19-22 Oct. 2008