TY - GEN
T1 - Our emotions as seen through a webcam
AU - Sommer, Natalie
AU - Hirshfield, Leanne
AU - Velipasalar, Senem
PY - 2014
Y1 - 2014
AB - Humanity's desire to enable machines to "understand" us drives research that seeks to uncover the mysteries of human beings and of their reactions, because a computer's ability to correctly classify our emotions will lead to an enhanced experience for the user. Using the computer's eye, a webcam, we can capture human reaction data by recording facial images in response to stimuli. The data of interest in this research are changes in pupil size and gaze patterns in conjunction with classification of facial expression. Although fusion of these measurements has been considered in the past by Xiang and Kankanhalli [14] as well as Valverde et al. [15], their approach was quite different from ours: both groups used a multimodal set-up, an eye tracker alongside a webcam, and the stimulus was visual. A novel approach is to avoid costly eye trackers and rely on images acquired only from a standard webcam to measure changes in pupil size, gaze patterns, and facial expression in response to auditory stimuli. The auditory mode is often preferred because, unlike visual stimulation from a monitor, luminance does not need to be accounted for. The fusion of the information from these features is then used to distinguish between negative, neutral, and positive emotional states. In this paper we discuss an experiment (n = 15) in which stimuli from the auditory version of the International Affective Picture System (IAPS) are used to elicit these three main emotions in participants. Webcam data are recorded during the experiments, and advanced signal processing and feature extraction techniques are applied to the resulting image files to build a model capable of predicting neutral, positive, and negative emotional states.
UR - http://www.scopus.com/inward/record.url?scp=84958521504&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84958521504&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-07527-3_8
DO - 10.1007/978-3-319-07527-3_8
M3 - Conference contribution
AN - SCOPUS:84958521504
SN - 9783319075266
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 78
EP - 89
BT - Foundations of Augmented Cognition
PB - Springer Verlag
T2 - 8th International Conference on Augmented Cognition, AC 2014 - Held as Part of 16th International Conference on Human-Computer Interaction, HCI International 2014
Y2 - 22 June 2014 through 27 June 2014
ER -