Biometrics authentication method using lip motion in utterance

3 Author(s)
Atsushi Sayo; Faculty of Engineering Science, Kansai University, 3-3-35, Yamate-cho, Suita-shi, Osaka 564-8680, Japan; Yoshinobu Kajikawa; Mitsuji Muneyasu

In this paper, we propose a biometric authentication method using lip motion during utterance. The proposed method authenticates a person using lip shape (a physical trait) and the lip motion accompanying utterance (a behavioral trait). Hence, it can be realized with only a camera that extracts the lip area, without the special equipment required by other personal authentication methods, and the registration data can easily be changed. Conventional methods based on lip images use the time differential coefficients of the horizontal and vertical lip ranges and their corresponding rates. However, these features cannot describe lip motion in detail because extracting the exact lip shape is difficult. Therefore, the proposed method uses features concerning lip shape, extracted from a divided lip region, and matches lip motion by dynamic time warping (DTW). Experimental results demonstrate that the proposed method achieves an authentication rate of more than 99.5%.
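The abstract's matching step relies on dynamic time warping to compare lip-motion feature sequences that may differ in speaking speed. The sketch below shows classic DTW between two sequences of per-frame feature vectors; the toy feature vectors and the Euclidean local distance are illustrative assumptions, not the paper's actual divided-lip-region features.

```python
# Illustrative sketch: dynamic time warping (DTW) between two feature
# sequences, as used conceptually to match lip-motion trajectories.
# The feature vectors and local distance here are hypothetical stand-ins.
import math

def dtw_distance(seq_a, seq_b):
    """Return the DTW alignment cost between two sequences of feature vectors."""
    n, m = len(seq_a), len(seq_b)
    # cost[i][j] = minimal cumulative cost aligning seq_a[:i] with seq_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(seq_a[i - 1], seq_b[j - 1])  # local Euclidean distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]

# Toy per-frame lip-shape descriptors: the probe repeats one frame,
# mimicking a slower utterance of the same word.
enrolled = [(0.0, 1.0), (0.5, 1.2), (1.0, 1.0), (0.5, 0.8)]
probe    = [(0.0, 1.0), (0.5, 1.2), (0.5, 1.2), (1.0, 1.0), (0.5, 0.8)]
print(dtw_distance(enrolled, probe))  # -> 0.0 (the repeated frame is absorbed by the warping path)
```

In an authentication setting, the DTW cost between the probe sequence and the enrolled template would be thresholded to accept or reject the claimed identity; a low cost indicates that the lip motions align well despite timing differences.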

Published in:

2011 8th International Conference on Information, Communications and Signal Processing (ICICS)

Date of Conference:

13-16 Dec. 2011