Exploring eye activity as an indication of emotional states using an eye-tracking sensor

Sharifa Alghowinem, Majdah Alshehri, Roland Goecke, Michael Wagner

    Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

    43 Citations (Scopus)

    Abstract

    The automatic detection of human emotional states has attracted great interest lately, not only for its applications in the human-computer interaction field but also in psychological studies. Using an emotion elicitation paradigm, we investigate whether eye activity holds discriminative power for detecting affective states. Our emotion elicitation paradigm includes emotions induced by watching emotional movie clips and spontaneous emotions elicited by interviewing participants about emotional events in their lives. To reduce gender variability, the selected participants were 60 female native Arabic speakers (30 young adults and 30 mature adults). In general, the automatic classification results using eye activity were reasonable, giving a 66% correct recognition rate on average. Statistical measures show statistically significant differences in eye activity patterns between positive and negative emotions. We conclude that eye activity, including eye movement, pupil dilation, and pupil invisibility, could be used as complementary cues for the automatic recognition of human emotional states.
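    The record above does not spell out the feature extraction or the classifier used, so the following is only a minimal sketch of the general approach the abstract describes: summarising an eye-tracker recording into per-recording statistics covering the three cue types named (pupil dilation, eye movement, pupil invisibility) and classifying positive versus negative emotion with a cross-validated classifier. The feature names, the scikit-learn SVM choice, and the synthetic data are assumptions for illustration, not the authors' method.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def eye_features(pupil_diam, gaze_x, gaze_y, valid):
    """Summarise one eye-tracker recording into a fixed-length vector.

    pupil_diam : per-sample pupil diameter; gaze_x / gaze_y : gaze
    coordinates; valid : boolean mask, False where the pupil was not
    visible to the tracker. All inputs are hypothetical arrays.
    """
    d = pupil_diam[valid]
    dx = np.diff(gaze_x[valid])
    dy = np.diff(gaze_y[valid])
    speed = np.hypot(dx, dy)                    # crude eye-movement speed
    return np.array([
        d.mean(), d.std(),                      # pupil-dilation statistics
        speed.mean(), speed.std(),              # eye-movement statistics
        1.0 - valid.mean(),                     # pupil-invisibility rate
    ])

# X: one feature vector per recording; y: 1 = positive, 0 = negative
# emotion. Synthetic placeholder data stands in for real recordings.
rng = np.random.default_rng(0)
X = np.vstack([
    eye_features(rng.normal(3.5, 0.3, 500),
                 rng.normal(0, 1, 500), rng.normal(0, 1, 500),
                 rng.random(500) > 0.1)
    for _ in range(60)
])
y = rng.integers(0, 2, 60)                      # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random data
```

    On real data, each row of X would summarise one participant's recording for one movie clip or interview segment, and y would hold the elicited emotion label; the random placeholders here only demonstrate the pipeline shape.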

    Original language: English
    Title of host publication: Intelligent Systems for Science and Information
    Subtitle of host publication: Extended and Selected Results from the Science and Information Conference 2013
    Publisher: Springer Verlag
    Pages: 261-276
    Number of pages: 16
    ISBN (Print): 9783319047010
    DOIs
    Publication status: Published - 2014

    Publication series

    Name: Studies in Computational Intelligence
    Volume: 542
    ISSN (Print): 1860-949X
