In this paper, a two-stage facial expression rating and classification technique is proposed, tracking the intensity of an emotion as it evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage involving an HMM classifier for recognising six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape-transformation approach is compared with a marker-based extraction method for extracting facial expression features. Performance evaluation on an Italian audiovisual emotion database, DaFEx [1, 2], comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at different intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.
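The abstract's pipeline (vector-valued deformation field → scalar SVD per sector → stage-one intensity rating) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the integration of the deformation field over a sector, the threshold values, and all function names are assumptions for demonstration.

```python
import numpy as np

def sector_volumetric_difference(deformation, sector_mask):
    """Scalar SVD for one face sector: here taken as the integral (sum) of the
    per-pixel magnitude of the vector-valued deformation field over the sector.
    (Illustrative definition; the paper's exact formulation may differ.)"""
    magnitude = np.linalg.norm(deformation, axis=-1)  # displacement length per pixel
    return float(magnitude[sector_mask].sum())

def intensity_level(svd_values, low_thr=10.0, high_thr=30.0):
    """Stage-one rating: map the total SVD over all sectors to a
    low / medium / high expression intensity (thresholds are hypothetical)."""
    total = sum(svd_values)
    if total < low_thr:
        return "low"
    if total < high_thr:
        return "medium"
    return "high"

# Toy example: a 4x4 deformation field with one active 2x2 sector.
field = np.zeros((4, 4, 2))
field[:2, :2] = [3.0, 4.0]            # each pixel displaced by (3, 4) -> norm 5
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True                   # the sector covers the top-left 2x2 block
svd = sector_volumetric_difference(field, mask)   # 4 pixels * 5.0 = 20.0
level = intensity_level([svd])                    # 20.0 falls in the medium band
```

In the full system, one such scalar per sector would feed the stage-one intensity rating, and the temporal sequence of features would then be passed to the stage-two HMM emotion classifier.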
|Title of host publication||Proceedings of the 2008 2nd International Conference on Signal Processing and Communication Systems (ICSPCS 2008)|
|Publication status||Published - 15 Dec 2008|
|Event||2nd International Conference on Signal Processing and Communication Systems, ICSPCS 2008 - Gold Coast, QLD, Australia|
Duration: 15 Dec 2008 → 17 Dec 2008
|Conference||2nd International Conference on Signal Processing and Communication Systems, ICSPCS 2008|
|City||Gold Coast, QLD|
|Period||15/12/08 → 17/12/08|