Abstract
Affect, encompassing the broad spectrum of feeling, emotion, and mood, is a key component of the human experience. It plays a crucial role in shaping behaviour, cognition, and interpersonal interactions. Mood and emotion are two distinct yet interconnected affective states: mood is an enduring affective state that is not necessarily tied to a specific stimulus, whereas emotion is a relatively brief and intense affective state that is often triggered by a specific stimulus or event. While the interplay between mood and emotion is firmly established from a psychological perspective, very few studies have explored mood inference, or modelling of the mood-emotion interplay, from a computational viewpoint. Even within the affective computing community, the predominant focus has been on categorical and dimensional emotion inference, and few studies have examined mood using computational approaches.

The broader aim of this thesis is to jointly model human mood and emotion, in order to understand the interplay between the two affective states and to explore whether such joint modelling enhances automatic affect inference performance. Specifically, the research described in this thesis investigates a joint spatio-temporal modelling approach to human mood and emotion using facial video data. The aim is to perform mood inference and examine the interplay between mood and emotion leveraging deep learning algorithms.
To this end, (a) emotion-change information, i.e. the difference in valence, is used in addition to mood labels for examining mood (Chapter 4); (b) the influence of integrating neural attention modules into various model architectures is investigated (Chapter 5); (c) obviating the need for ground-truth emotion labels, weak supervision is employed to deduce pseudo emotion-similarity labels for inferring mood (Chapter 6); and (d) to model the converse perspective, emotion inference is performed using mood as a contextual cue (Chapter 7).
The research findings indicate that jointly modelling mood and emotion improves affect prediction performance. The experimental results demonstrate that incorporating emotion-change information is beneficial for mood inference. Integrating neural attention mechanisms enhances mood prediction performance, and adding emotion-change information further improves inference accuracy. Similar trends in mood prediction performance are achieved when weak supervision is employed to deduce emotional similarity. Furthermore, using mood as a contextual cue enhances emotion inference performance, rounding off the mood-emotion circle.
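To make the emotion-change idea concrete, the sketch below derives a valence-difference signal from a per-frame valence sequence and pairs it with a naive mood decision. This is a hypothetical illustration of the general notion of emotion-change as an auxiliary cue, not the thesis's deep learning models; the function names and the thresholding rule are assumptions for the example.

```python
# Hypothetical sketch: emotion-change as successive valence differences,
# plus a toy mood rule. Not the thesis architecture.

def emotion_change(valence):
    """Successive differences of per-frame valence: the 'emotion-change' signal."""
    return [b - a for a, b in zip(valence, valence[1:])]

def infer_mood(valence):
    """Toy mood label: 'positive' if mean valence over the clip is non-negative."""
    mean_v = sum(valence) / len(valence)
    return "positive" if mean_v >= 0 else "negative"

clip = [0.1, 0.3, 0.2, 0.5, 0.4]   # per-frame valence values (hypothetical)
deltas = emotion_change(clip)       # approx. [0.2, -0.1, 0.3, -0.1]
mood = infer_mood(clip)             # "positive"
```

In a learned setting, such deltas would serve as an auxiliary input or prediction target alongside the mood label, rather than being thresholded directly.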
| Date of Award | 2024 |
| --- | --- |
| Original language | English |
| Supervisor | Ramanathan Subramanian (Supervisor), Roland Goecke (Supervisor) & Ibrahim Radwan (Supervisor) |