To achieve subject-independent facial feature detection and extraction with robustness to illumination variation, this paper presents a novel method combining integral projection and the Gabor transform. First, to avoid manually selected expression features, we employ binary-image and gray-level integral projection to automatically detect and locate the exact positions of facial features. Second, we segment the extracted regions into small cells of 7×7 pixels each and apply the Gabor transform to each cell; this greatly reduces the execution time of the Gabor transform while retaining the important information. Finally, a support vector machine is used to classify facial emotions. Tested on the JAFFE database, the method achieves a high recognition rate of 94%.
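The two feature-extraction steps described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the Gabor parameters (wavelength, bandwidth, number of orientations) are assumed for the example, since the abstract only specifies the 7×7 cell size; `integral_projections` computes the row and column gray-level sums whose minima indicate dark facial features (eyes, mouth), and `cell_gabor_features` applies a Gabor kernel per 7×7 cell.

```python
import numpy as np


def integral_projections(gray):
    """Horizontal and vertical gray-level integral projections.

    Rows/columns crossing dark features (eyes, eyebrows, mouth)
    yield lower sums, so local minima mark candidate feature
    positions for automatic localization.
    """
    horizontal = gray.sum(axis=1)  # one value per row
    vertical = gray.sum(axis=0)    # one value per column
    return horizontal, vertical


def gabor_kernel(size=7, theta=0.0, lam=4.0, sigma=2.0, gamma=0.5):
    """Real part of a Gabor kernel sized to match one cell.

    lam (wavelength), sigma, and gamma are illustrative values,
    not taken from the paper.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lam)


def cell_gabor_features(region, cell=7, orientations=4):
    """Split an extracted feature region into cell x cell blocks
    and take one Gabor response magnitude per block and
    orientation, yielding a compact feature vector."""
    h, w = region.shape
    feats = []
    for theta in np.linspace(0.0, np.pi, orientations, endpoint=False):
        kernel = gabor_kernel(size=cell, theta=theta)
        for i in range(0, h - h % cell, cell):
            for j in range(0, w - w % cell, cell):
                block = region[i:i + cell, j:j + cell]
                feats.append(np.abs((block * kernel).sum()))
    return np.array(feats)
```

Applying one Gabor kernel per small cell, rather than convolving a filter bank over the whole image, is what keeps the transform cheap while preserving the locally important texture information; the resulting vectors would then be fed to the SVM classifier.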