The face conveys information about a person's age, sex, background, and identity, as well as what they are feeling, thinking, or likely to do next. Facial expression regulates face-to-face interaction, indicates reciprocity and interpersonal attraction or repulsion, and enables intersubjectivity between members of different cultures. Facial expression indexes neurological and psychiatric functioning and reveals personality and socioemotional development. Not surprisingly, the face has been of keen interest to behavioral scientists. About 15 years ago, computer scientists became increasingly interested in using computer vision and graphics to automatically analyze and synthesize facial expression. This effort was made possible in part by the development in psychology of detailed coding systems for describing facial actions and their relation to basic emotions, that is, emotions that are interpreted similarly in diverse cultures. The most detailed of these systems, the Facial Action Coding System (FACS), informed the development of the MPEG-4 facial animation parameters for video transmission and enabled progress toward automated measurement and synthesis of facial actions for research in affective computing, social signal processing, and behavioral science. This article reports key advances in behavioral science that are becoming possible through these developments. Before these advances are discussed, automated facial image analysis and synthesis (AFAS) is briefly described.
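To make the idea of a facial coding system concrete, the sketch below shows how FACS-style descriptions might be represented programmatically. FACS decomposes expressions into numbered action units (AUs), and certain AU combinations are commonly cited as prototypes for basic emotions. The specific AU-to-emotion mappings and function names here are illustrative textbook prototypes chosen for this example, not an authoritative standard or the article's own method.

```python
# Illustrative sketch: prototypical FACS action-unit (AU) combinations
# for some basic emotions. These mappings are common textbook prototypes
# used for illustration only, not a definitive coding scheme.
PROTOTYPES = {
    "happiness": frozenset({6, 12}),        # cheek raiser + lip corner puller
    "sadness":   frozenset({1, 4, 15}),     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  frozenset({1, 2, 5, 26}),  # brow raisers + upper lid raiser + jaw drop
    "anger":     frozenset({4, 5, 7, 23}),  # brow lowerer + lid raiser/tightener + lip tightener
}

def label_emotion(observed_aus):
    """Return the first emotion whose prototype AUs are all present, else None."""
    observed = set(observed_aus)
    for emotion, aus in PROTOTYPES.items():
        if aus <= observed:
            return emotion
    return None

print(label_emotion({6, 12}))   # a Duchenne-type smile pattern
print(label_emotion({45}))      # an AU with no matching prototype here
```

In practice, automated systems score the presence and intensity of individual AUs from video rather than matching fixed sets, but the lookup above conveys how a sign-based coding system separates the description of facial actions from their emotional interpretation.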