Computational Models of the Production and Perception of Facial Expressions

Abstract Details

2018, Doctor of Philosophy, Ohio State University, Electrical and Computer Engineering.
By combining different facial muscle actions, called Action Units (AUs), humans can produce an extraordinarily large number of facial expressions. Computational models and studies in cognitive science have long hypothesized that the brain needs to visually interpret these action units to understand other people's actions and intentions. Surprisingly, no studies have identified the neural basis of the visual recognition of these action units. Here, using functional Magnetic Resonance Imaging (fMRI), we identify a consistent and differential coding of action units in the brain. Crucially, in a brain region thought to be responsible for the processing of changeable aspects of the face, pattern analysis could decode the presence of specific action units in an image. This coding was found to be consistent across people, facilitating the estimation of the perceived action units in participants not used to train the pattern analysis decoder.

Research in face perception and emotion theory requires very large annotated databases of images of facial expressions of emotion. Useful annotations include AUs and their intensities, as well as emotion category. This annotation cannot be practically achieved manually. Herein, we present a novel computer vision algorithm to annotate a large database of a million images of facial expressions of emotion from the wild (i.e., face images downloaded from the Internet). We use WordNet to download 1,000,000 images of facial expressions with associated emotion keywords from the Internet. The downloaded images are then automatically annotated with AUs, AU intensities, and emotion categories by our algorithm. The result is a highly useful database that can be readily queried using semantic descriptions for applications in computer vision, affective computing, and social and cognitive psychology.

Color is a fundamental image feature of facial expressions. For example, when we furrow our eyebrows in anger, blood rushes in and a reddish color becomes apparent around that area of the face. Surprisingly, these image properties have not been exploited to recognize the facial action units (AUs) associated with these expressions. Herein, we present the first system to recognize AUs and their intensities using these functional color changes. These color features are shown to be robust to changes in identity, gender, race, ethnicity, and skin color. Because these image changes are given by functions rather than vectors, we use functional classifiers to identify the most discriminant color features of an AU and its intensities. We demonstrate that, using these discriminant color features, one can achieve results superior to those of the state-of-the-art.

Lastly, the study of emotion has reached an impasse that can only be addressed once we know which facial expressions are used within and across cultures in the wild, not in controlled lab conditions. Yet, no such studies exist. Here, we present the first large-scale study of the production and visual perception of facial expressions of emotion in the wild. We find that, of the 16,384 possible facial configurations that people can produce, only 35 are successfully used to transmit emotive information across cultures, and 8 within a smaller number of cultures.
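
For intuition only, the sketch below shows how cross-subject decoding of an AU from multi-voxel activity patterns could be set up. It is not the dissertation's actual analysis pipeline; the data are synthetic stand-ins for fMRI response patterns, and the classifier choice (scikit-learn's LinearSVC with leave-one-subject-out cross-validation) is an assumed, generic one.

# Hypothetical sketch: decode the presence of an action unit (AU) from
# multi-voxel patterns of subjects not used for training. Synthetic data and
# a generic linear classifier -- not the dissertation's actual method.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_voxels = 10, 40, 200

# Assume an AU-related activity pattern that is shared across subjects.
au_pattern = rng.normal(0.0, 1.0, n_voxels)

X, y, groups = [], [], []
for subject in range(n_subjects):
    labels = rng.integers(0, 2, trials_per_subject)        # AU absent (0) or present (1)
    noise = rng.normal(0.0, 1.0, (trials_per_subject, n_voxels))
    patterns = noise + 0.5 * np.outer(labels, au_pattern)  # weak shared signal
    X.append(patterns)
    y.append(labels)
    groups.append(np.full(trials_per_subject, subject))
X, y, groups = np.vstack(X), np.concatenate(y), np.concatenate(groups)

# Leave-one-subject-out: train on nine subjects, test on the held-out one.
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf.fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

print(f"mean leave-one-subject-out decoding accuracy: {np.mean(accuracies):.2f}")

Above-chance accuracy on held-out subjects is the kind of evidence the abstract points to when it says the coding of action units is consistent across people.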
Aleix Martinez (Advisor)
Julie Golomb (Committee Member)
Yuan Zheng (Committee Member)
147 p.

Recommended Citations


  • Srinivasan, R. (2018). Computational Models of the Production and Perception of Facial Expressions [Doctoral dissertation, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu1531239299392184

    APA Style (7th edition)

  • Srinivasan, Ramprakash. Computational Models of the Production and Perception of Facial Expressions. 2018. Ohio State University, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=osu1531239299392184.

    MLA Style (8th edition)

  • Srinivasan, Ramprakash. "Computational Models of the Production and Perception of Facial Expressions." Doctoral dissertation, Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1531239299392184

    Chicago Manual of Style (17th edition)