TY - GEN
T1 - Exploring eye activity as an indication of emotional states using an eye-tracking sensor
AU - Alghowinem, Sharifa
AU - Alshehri, Majdah
AU - Goecke, Roland
AU - Wagner, Michael
PY - 2014
Y1 - 2014
N2 - The automatic detection of human emotional states has been of great interest lately, not only for its applications in the Human-Computer Interaction field, but also in psychological studies. Using an emotion elicitation paradigm, we investigate whether eye activity holds discriminative power for detecting affective states. Our emotion elicitation paradigm includes emotions induced by watching emotional movie clips and spontaneous emotions elicited by interviewing participants about emotional events in their lives. To reduce gender variability, the selected participants were 60 female native Arabic speakers (30 young adults and 30 mature adults). In general, the automatic classification results using eye activity were reasonable, giving a 66% correct recognition rate on average. Statistical measures show statistically significant differences in eye activity patterns between positive and negative emotions. We conclude that eye activity, including eye movement, pupil dilation and pupil invisibility, could be used as complementary cues for the automatic recognition of human emotional states.
KW - Emotion Recognition
KW - Eye Tracking
UR - http://www.scopus.com/inward/record.url?scp=84958553761&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-04702-7_15
DO - 10.1007/978-3-319-04702-7_15
M3 - Conference contribution
SN - 9783319047010
VL - 542
T3 - Studies in Computational Intelligence
SP - 261
EP - 276
BT - Intelligent Systems for Science and Information
A2 - Chen, Liming
A2 - Kapoor, Supriya
A2 - Bhatia, Rahul
PB - Springer
CY - Germany
T2 - Intelligent Systems for Science and Information 2014
Y2 - 27 August 2014 through 29 August 2014
ER -