TY - GEN
T1 - Evaluating content-centric vs. user-centric ad affect recognition
AU - Shukla, Abhinav
AU - Gullapuram, Shruti Shriya
AU - Katti, Harish
AU - Yadati, Karthik
AU - Kankanhalli, Mohan
AU - Subramanian, Ramanathan
N1 - Funding Information:
This research is supported by the National Research Foundation, Prime Minister's Office, Singapore under its International Research Centre in Singapore Funding Initiative.
Publisher Copyright:
© 2017 ACM.
PY - 2017/11/3
Y1 - 2017/11/3
N2 - Although advertisements (ads) often include strongly emotional content, very little work has been devoted to affect recognition (AR) from ads. This work explicitly compares content-centric and user-centric ad AR methodologies, and evaluates the impact of enhanced AR on computational advertising via a user study. Specifically, we (1) compile an affective ad dataset capable of evoking coherent emotions across users; (2) explore the efficacy of content-centric convolutional neural network (CNN) features for encoding emotions, and show that CNN features outperform low-level emotion descriptors; (3) examine user-centric ad AR by analyzing electroencephalogram (EEG) responses acquired from 11 viewers, and find that EEG signals encode emotional information better than content descriptors; and (4) investigate, through a study involving 12 users, the relationship between objective AR and subjective viewer experience while watching an ad-embedded online video stream. To our knowledge, this is the first work to (a) expressly compare user-centric vs. content-centric AR for ads, and (b) study the relationship between the modeling of ad emotions and its impact on a real-life advertising application.
AB - Although advertisements (ads) often include strongly emotional content, very little work has been devoted to affect recognition (AR) from ads. This work explicitly compares content-centric and user-centric ad AR methodologies, and evaluates the impact of enhanced AR on computational advertising via a user study. Specifically, we (1) compile an affective ad dataset capable of evoking coherent emotions across users; (2) explore the efficacy of content-centric convolutional neural network (CNN) features for encoding emotions, and show that CNN features outperform low-level emotion descriptors; (3) examine user-centric ad AR by analyzing electroencephalogram (EEG) responses acquired from 11 viewers, and find that EEG signals encode emotional information better than content descriptors; and (4) investigate, through a study involving 12 users, the relationship between objective AR and subjective viewer experience while watching an ad-embedded online video stream. To our knowledge, this is the first work to (a) expressly compare user-centric vs. content-centric AR for ads, and (b) study the relationship between the modeling of ad emotions and its impact on a real-life advertising application.
KW - Ads
KW - Affect recognition
KW - CNNs
KW - Computational advertising
KW - Content-centric vs. user-centric
KW - EEG
KW - Multimodal analytics
UR - http://www.scopus.com/inward/record.url?scp=85046827139&partnerID=8YFLogxK
U2 - 10.1145/3136755.3136796
DO - 10.1145/3136755.3136796
M3 - Conference contribution
AN - SCOPUS:85046827139
T3 - ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction
SP - 402
EP - 410
BT - ICMI 2017 - Proceedings of the 19th ACM International Conference on Multimodal Interaction
A2 - Lank, Edward
A2 - Hoggan, Eve
A2 - Subramanian, Sriram
A2 - Vinciarelli, Alessandro
A2 - Brewster, Stephen A.
PB - Association for Computing Machinery (ACM)
CY - United States
T2 - 19th ACM International Conference on Multimodal Interaction, ICMI 2017
Y2 - 13 November 2017 through 17 November 2017
ER -