Recognition of Advertisement Emotions with Application to Computational Advertising

Abhinav Shukla, Shruti Shriya Gullapuram, Harish Katti, Mohan Kankanhalli, Stefan Winkler, Ramanathan Subramanian

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Advertisements (ads) often contain strong emotions to capture audience attention and convey an effective message. Still, little work has focused on affect recognition (AR) from ads employing audiovisual or user cues. This work (1) compiles an affective video ad dataset which evokes coherent emotions across users; (2) explores the efficacy of content-centric convolutional neural network (CNN) features for ad AR vis-à-vis handcrafted audiovisual descriptors; (3) examines user-centric ad AR from electroencephalogram (EEG) signals; and (4) demonstrates how better affect predictions facilitate effective computational advertising via a study involving 18 users. Experiments reveal that (a) CNN features outperform handcrafted audiovisual descriptors for content-centric AR; (b) EEG features encode ad-induced emotions better than content-based features; (c) multi-task learning achieves optimal ad AR among a slew of classifiers; and (d) pursuant to (b), EEG features enable optimized ad insertion into streamed video compared to content-based or manual insertion, maximizing ad recall and viewing experience.

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: IEEE Transactions on Affective Computing
Publication status: E-pub ahead of print - 2020
Externally published: Yes
