User-centric affective video tagging from MEG and peripheral physiological responses

Mojtaba Khomami Abadi, Seyed Mostafa Kia, Ramanathan Subramanian, Paolo Avesani, Nicu Sebe

Research output: Conference contribution (peer-reviewed)

15 Citations (Scopus)

Abstract

This paper presents a new multimodal database and the associated results for the characterization of affect (valence, arousal and dominance) using magnetoencephalogram (MEG) brain signals and peripheral physiological signals (horizontal EOG, ECG, trapezius EMG). We attempt single-trial classification of affect in movie and music video clips employing emotional responses extracted from eighteen participants. The main findings of this study are that: (i) the MEG signal effectively encodes affective viewer responses, (ii) clip arousal is better predicted by MEG, while peripheral physiological signals are more effective for predicting valence, and (iii) prediction performance is better for movie clips than for music video clips.
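The abstract describes single-trial classification: each clip-viewing trial yields a feature vector from the recorded signals, and a classifier predicts the affect label for one held-out trial from the remaining trials. The sketch below illustrates this evaluation scheme with leave-one-trial-out cross-validation on synthetic data; the nearest-centroid classifier, the trial count, and the feature dimensionality are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 36 trials with 10 features each (e.g. band-power
# descriptors); dimensions and labels are illustrative only.
n_trials, n_features = 36, 10
labels = np.array([0, 1] * (n_trials // 2))  # 0 = low arousal, 1 = high
# Synthetic features with a small class-dependent mean shift
X = rng.normal(size=(n_trials, n_features)) + 0.8 * labels[:, None]

def loto_nearest_centroid(X, y):
    """Leave-one-trial-out accuracy of a nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        train = np.arange(len(y)) != i          # hold out trial i
        c0 = X[train & (y == 0)].mean(axis=0)   # class centroids from
        c1 = X[train & (y == 1)].mean(axis=0)   # the remaining trials
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += int(pred == y[i])
    return correct / len(y)

accuracy = loto_nearest_centroid(X, labels)
print(f"LOTO accuracy: {accuracy:.2f}")
```

In a per-subject protocol like the one described, this loop would run separately on each participant's trials, and the per-modality comparison (MEG vs. peripheral signals) would repeat it with each feature set.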

Original language: English
Title of host publication: Proceedings - 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, ACII 2013
Editors: Anton Nijholt, Maja Pantic, Sidney D'Mello
Place of publication: United States
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 582-587
Number of pages: 6
ISBN (Print): 9780769550480
DOIs
Publication status: Published - 2013
Externally published: Yes
Event: 2013 5th Humaine Association Conference on Affective Computing and Intelligent Interaction, ACII 2013 - Geneva, Switzerland
Duration: 2 Sept 2013 - 5 Sept 2013

Publication series

Name: Proceedings - 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, ACII 2013

Conference

Conference: 2013 5th Humaine Association Conference on Affective Computing and Intelligent Interaction, ACII 2013
Country/Territory: Switzerland
City: Geneva
Period: 2/09/13 - 5/09/13
