Our emotions as seen through a webcam

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Humanity's desire to enable machines to "understand" us drives research that seeks to uncover the mysteries of human beings and their reactions, because a computer's ability to correctly classify our emotions leads to an enhanced user experience. Using the computer's eye, a webcam, we can acquire human-reaction data by capturing facial images in response to stimuli. The data of interest in this research are changes in pupil size and gaze patterns, in conjunction with classification of facial expression. Although fusion of these measurements has been considered in the past by Xiang and Kankanhalli [14] as well as Valverde et al. [15], their approach was quite different from ours: both groups used a multimodal set-up, an eye tracker alongside a webcam, and the stimulus was visual. Our novel approach avoids costly eye trackers and relies only on images acquired from a standard webcam to measure changes in pupil size, gaze patterns, and facial expression in response to auditory stimuli. The auditory mode is often preferred because, unlike visual stimulation from a monitor, luminance does not need to be accounted for. The fused information from these features is then used to distinguish between negative, neutral, and positive emotional states. In this paper we discuss an experiment (n = 15) in which stimuli from the auditory version of the international affective picture system (IAPS) are used to elicit these three main emotions in participants. Webcam data are recorded during the experiments, and advanced signal processing and feature extraction techniques are applied to the resulting image files to build a model capable of predicting neutral, positive, and negative emotional states.
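The abstract does not specify the model used to fuse the three feature streams (pupil size change, gaze pattern, facial expression) into a three-way prediction. As a loose illustration only, and not the authors' actual method, the sketch below fuses three hypothetical per-trial features into a vector and classifies it with a toy nearest-centroid rule; all feature names and values are invented for the example.

```python
import statistics

def fuse_features(pupil_change, gaze_dispersion, expression_valence):
    """Combine three hypothetical webcam-derived measurements into one vector."""
    return (pupil_change, gaze_dispersion, expression_valence)

class NearestCentroid3:
    """Toy three-class (negative/neutral/positive) classifier: a stand-in for
    whatever model the paper actually trains, which the abstract leaves open."""

    def fit(self, X, y):
        # Centroid per emotion label: the per-column mean of its training rows.
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = tuple(
                statistics.mean(col) for col in zip(*rows))
        return self

    def predict(self, x):
        # Assign the label whose centroid is closest in squared distance.
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: sq_dist(self.centroids[lab]))

# Synthetic training trials (illustrative numbers, not experimental data).
X = [
    fuse_features(0.9, 0.2, -0.8),  # strong dilation, negative expression
    fuse_features(0.8, 0.3, -0.7),
    fuse_features(0.1, 0.1, 0.0),   # little change: neutral
    fuse_features(0.2, 0.2, 0.1),
    fuse_features(0.7, 0.6, 0.9),   # dilation with positive expression
    fuse_features(0.8, 0.5, 0.8),
]
y = ["negative", "negative", "neutral", "neutral", "positive", "positive"]

model = NearestCentroid3().fit(X, y)
print(model.predict(fuse_features(0.85, 0.25, -0.75)))  # → negative
```

The point of the sketch is only the fusion step: features from separate measurement channels are concatenated into one vector before classification, rather than each channel being classified on its own.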

Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer Verlag
Pages: 78-89
Number of pages: 12
Volume: 8534 LNAI
ISBN (Print): 978-3-319-07526-6
DOIs: https://doi.org/10.1007/978-3-319-07527-3_8
State: Published - 2014
Event: 8th International Conference on Augmented Cognition, AC 2014 - Held as Part of 16th International Conference on Human-Computer Interaction, HCI International 2014 - Heraklion, Crete, Greece
Duration: Jun 22 2014 - Jun 27 2014

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 8534 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 8th International Conference on Augmented Cognition, AC 2014 - Held as Part of 16th International Conference on Human-Computer Interaction, HCI International 2014
Country: Greece
City: Heraklion, Crete
Period: 6/22/14 - 6/27/14

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science


  • Cite this

    Sommer, N., Hirshfield, L. M., & Velipasalar, S. (2014). Our emotions as seen through a webcam. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8534 LNAI, pp. 78-89). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 8534 LNAI). Springer Verlag. https://doi.org/10.1007/978-3-319-07527-3_8