Acoustic and visual signal based context awareness system for mobile application

5 Author(s)
Woo-Hyun Choi (Sch. of Electr. Eng., Korea Univ., Seoul, South Korea); Seung-Il Kim; Min-Seok Keum; D. K. Han; et al.

In this paper, an acoustic and visual signal based context awareness system is proposed for a mobile application. In particular, a multimodal system is designed that can sense and determine, in real time, user contextual information, such as where the user is or what the user is doing, by processing acoustic and visual signals from the sensors available in a mobile device. A variety of contextual information, such as babble sound in a cafeteria or the user's movement, can be recognized by the proposed acoustic and visual feature extraction and classification methods. We first describe the overall structure of the proposed system, and then present the algorithm for each module performing detection or classification of the various contextual scenarios. Representative experiments demonstrate the superiority of the proposed system, while an actual implementation of the proposed scheme on a mobile device such as a smartphone confirms its effectiveness and practical realization.
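
The abstract describes a two-stream pipeline in which acoustic and visual features are extracted separately and then classified to infer the user's context. The sketch below is only a minimal illustration of that idea, assuming hypothetical stand-in features (log band energies for audio, frame-difference motion energy for video) and a toy nearest-centroid classifier; the paper's actual feature extraction, classifiers, and context labels are not specified in this abstract.

```python
# Illustrative sketch only: all feature choices, the classifier, and the
# context labels below are hypothetical stand-ins, not the paper's method.
import numpy as np

def acoustic_features(audio, n_fft=512, n_bands=20):
    """Log band energies of one audio frame (hypothetical acoustic feature)."""
    spectrum = np.abs(np.fft.rfft(audio, n_fft)) ** 2
    bands = np.array_split(spectrum, n_bands)          # equal-width spectral bands
    return np.log(np.array([b.sum() for b in bands]) + 1e-10)

def visual_features(prev_frame, curr_frame):
    """Mean absolute frame difference as a simple motion cue (hypothetical visual feature)."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    return np.array([diff.mean()])

class NearestCentroidContext:
    """Toy classifier: assign the context whose training centroid is closest."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, x):
        dists = np.linalg.norm(self.centroids_ - x, axis=1)
        return self.labels_[int(np.argmin(dists))]

# Usage idea: concatenate both modalities into one feature vector per time window,
# then classify, e.g. into hypothetical contexts like "cafeteria" or "walking".
# feats = np.concatenate([acoustic_features(audio_frame),
#                         visual_features(prev_img, curr_img)])
# context = model.predict(feats)
```

The early fusion shown here (concatenating the two feature vectors before classification) is just one plausible way to combine the modalities; per-module detectors whose decisions are fused afterwards would also fit the abstract's description of separate detection/classification modules.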

Published in:

IEEE Transactions on Consumer Electronics (Volume: 57, Issue: 2)