TY - GEN
T1 - Gender and emotion recognition with implicit user signals
AU - Bilalpur, Maneesh
AU - Kia, Seyed Mostafa
AU - Chawla, Manisha
AU - Chua, Tat-Seng
AU - Subramanian, Ramanathan
N1 - Funding Information:
This study is partly supported by the Human Centered Cyber-physical Systems research grant from Singapore’s Agency for Science, Technology and Research (A*STAR).
Publisher Copyright:
© 2017 Association for Computing Machinery.
PY - 2017/11/3
Y1 - 2017/11/3
N2 - We examine the utility of implicit user behavioral signals, captured using low-cost off-the-shelf devices, for anonymous gender and emotion recognition. A user study designed to examine male and female sensitivity to facial emotions confirms that females recognize (especially negative) emotions more quickly and accurately than males, mirroring prior findings. Implicit viewer responses in the form of EEG brain signals and eye movements are then examined for the existence of (a) emotion- and gender-specific patterns in event-related potentials (ERPs) and fixation distributions, and (b) emotion and gender discriminability. Experiments reveal that (i) gender- and emotion-specific differences are observable from ERPs, (ii) multiple similarities exist between explicit responses gathered from users and their implicit behavioral signals, and (iii) significantly above-chance (≈70%) gender recognition is achievable by comparing emotion-specific EEG responses; gender differences are encoded best for anger and disgust. Also, fairly modest valence (positive vs. negative emotion) recognition is achieved with EEG and eye-based features.
AB - We examine the utility of implicit user behavioral signals, captured using low-cost off-the-shelf devices, for anonymous gender and emotion recognition. A user study designed to examine male and female sensitivity to facial emotions confirms that females recognize (especially negative) emotions more quickly and accurately than males, mirroring prior findings. Implicit viewer responses in the form of EEG brain signals and eye movements are then examined for the existence of (a) emotion- and gender-specific patterns in event-related potentials (ERPs) and fixation distributions, and (b) emotion and gender discriminability. Experiments reveal that (i) gender- and emotion-specific differences are observable from ERPs, (ii) multiple similarities exist between explicit responses gathered from users and their implicit behavioral signals, and (iii) significantly above-chance (≈70%) gender recognition is achievable by comparing emotion-specific EEG responses; gender differences are encoded best for anger and disgust. Also, fairly modest valence (positive vs. negative emotion) recognition is achieved with EEG and eye-based features.
KW - EEG
KW - Eye movements
KW - Facial emotion processing
KW - Gender and emotion recognition
KW - Gender differences
KW - Implicit behavioral signals
UR - http://www.scopus.com/inward/record.url?scp=85046442440&partnerID=8YFLogxK
U2 - 10.1145/3136755.3136790
DO - 10.1145/3136755.3136790
M3 - Conference contribution
AN - SCOPUS:85046442440
SN - 9781450355438
T3 - ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction
SP - 379
EP - 387
BT - ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction
A2 - Lank, Edward
A2 - Hoggan, Eve
A2 - Subramanian, Sriram
A2 - Vinciarelli, Alessandro
A2 - Brewster, Stephen A.
PB - Association for Computing Machinery (ACM)
CY - United States
T2 - 19th ACM International Conference on Multimodal Interaction, ICMI 2017
Y2 - 13 November 2017 through 17 November 2017
ER -