Smartphones such as the iPhone and Android devices have become increasingly popular, and their improved performance now allows diverse applications to run on them. One promising application area is reading the emotional state of a human being and using it to communicate with the computer. In this paper, we present a robust facial expression recognition system implemented on a smartphone, built on the Active Appearance Model (AAM). To counter the degradation of AAM fitting under illumination variation, a Difference of Gaussians (DoG) filter is applied before the AAM stage. A neural network trained on the Cohn-Kanade database classifies the facial expression states. Results and future work are described.
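The DoG preprocessing step mentioned above can be sketched as follows. This is a minimal illustration of Difference-of-Gaussians illumination normalization, not the paper's actual implementation; the sigma values and the zero-mean/unit-variance rescaling are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_normalize(image, sigma1=1.0, sigma2=2.0):
    """Difference-of-Gaussians (DoG) filtering: subtracting a wider
    Gaussian blur (sigma2) from a narrower one (sigma1) suppresses
    low-frequency illumination gradients while preserving the edge
    structure that AAM fitting relies on. The sigma defaults here are
    illustrative, not taken from the paper."""
    img = image.astype(np.float64)
    dog = gaussian_filter(img, sigma1) - gaussian_filter(img, sigma2)
    # Rescale to zero mean and unit variance so downstream stages see
    # a consistent dynamic range regardless of lighting conditions.
    return (dog - dog.mean()) / (dog.std() + 1e-8)
```

Applied to a face image before AAM fitting, this removes slowly varying shading (e.g. a lamp on one side of the face) while keeping the local contrast around facial features.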