Statistical models of shape and appearance are powerful tools in computer vision for near-correct interpretation of images. In this paper, we present a method for classifying facial expressions based on features extracted from facial components. The face, the window to the inner self of an individual, can be analyzed for distinct expressions such as sadness, happiness, anger, surprise, disgust, and fear. The facial region is first detected; the image is then pre-processed with an Active Appearance Model (AAM) to extract the vital features of the facial components. Six classes are formed, one per expression, and the model built by the AAM procedure is compared against the query face. The Mahalanobis distance is used to classify the query image into the best-fit class. The public Japanese Female Facial Expression (JAFFE) database, comprising over 200 still images, is used to evaluate our method. A high classification rate is observed.
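The minimum-Mahalanobis-distance classification step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class labels, the 2-D synthetic feature vectors, and the function names are assumptions standing in for the AAM feature vectors of the six expression classes.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    # Mahalanobis distance sqrt((x - mu)^T Sigma^-1 (x - mu))
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def classify(x, class_means, class_covs):
    # Assign x to the class whose mean is nearest in Mahalanobis distance
    best_label, best_dist = None, np.inf
    for label, mean in class_means.items():
        cov_inv = np.linalg.inv(class_covs[label])
        dist = mahalanobis(x, mean, cov_inv)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy example: synthetic 2-D "feature vectors" for two expression classes
# (the real method would use AAM shape/appearance parameters per class)
rng = np.random.default_rng(0)
happy = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
sad = rng.normal([5.0, 5.0], 1.0, size=(50, 2))
means = {"happy": happy.mean(axis=0), "sad": sad.mean(axis=0)}
covs = {"happy": np.cov(happy.T), "sad": np.cov(sad.T)}

print(classify(np.array([4.8, 5.2]), means, covs))  # nearest class: "sad"
```

Unlike the Euclidean distance, the Mahalanobis distance weights each feature dimension by the inverse class covariance, so directions in which a class varies widely contribute less to the distance.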