Investigating a two stage facial expression rating and classification technique

Research output: A Conference proceeding or a Chapter in Book › Conference contribution

1 Citation (Scopus)
1 Download (Pure)

Abstract

In this paper, a two-stage facial expression rating and classification technique is proposed, tracking the intensity of an emotion as it evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage involving an HMM classifier for recognising six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape transformation approach is compared with a marker-based extraction method for extracting facial expression features. The performance evaluation, done on the Italian audiovisual emotion database DaFEx [1,2], comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at different intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.
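The two-stage idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the thresholds, toy per-emotion HMMs, and function names below are all illustrative assumptions. Stage 1 bins a scalar SVD-style intensity score into low/medium/high; stage 2 scores an observation sequence against one discrete HMM per emotion with the forward algorithm and picks the most likely emotion.

```python
# Hedged sketch (not the paper's implementation) of a two-stage expression
# classifier: stage 1 rates intensity from a scalar SVD-style score; stage 2
# picks the emotion whose toy discrete HMM best explains the sequence.
import numpy as np

def stage1_intensity(svd_score, thresholds=(0.2, 0.6)):
    """Bin a scalar Sector Volumetric Difference score into an intensity
    level. Thresholds are illustrative, not taken from the paper."""
    low, high = thresholds
    if svd_score < low:
        return "low"
    return "medium" if svd_score < high else "high"

def hmm_log_likelihood(obs, start, trans, emit):
    """Scaled forward algorithm: log P(obs | HMM) for discrete observations.
    start: (S,), trans: (S, S), emit: (S, V) row-stochastic arrays."""
    alpha = start * emit[:, obs[0]]
    scale = alpha.sum()
    log_lik = np.log(scale)
    alpha = alpha / scale
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha = alpha / scale
    return log_lik

def stage2_emotion(obs, models):
    """Pick the emotion whose HMM assigns the highest log-likelihood."""
    return max(models, key=lambda e: hmm_log_likelihood(obs, *models[e]))

# Toy per-emotion models: (start probs, transition matrix, emission matrix).
models = {
    "happiness": (np.array([0.5, 0.5]),
                  np.array([[0.7, 0.3], [0.3, 0.7]]),
                  np.array([[0.9, 0.1], [0.8, 0.2]])),  # favours symbol 0
    "anger":     (np.array([0.5, 0.5]),
                  np.array([[0.7, 0.3], [0.3, 0.7]]),
                  np.array([[0.1, 0.9], [0.2, 0.8]])),  # favours symbol 1
}
```

With these toy models, `stage1_intensity(0.7)` returns `"high"` and `stage2_emotion([0, 0, 0], models)` returns `"happiness"`, since that model strongly favours symbol 0. A real system would train one HMM per emotion on feature sequences derived from the deformation field.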

Original language: English
Title of host publication: Proceedings of the 2008 2nd International Conference on Signal Processing and Communication Systems (ICSPCS 2008)
Publisher: IEEE
ISBN (Print): 9781424442423
DOI: https://doi.org/10.1109/ICSPCS.2008.4813754
Publication status: Published - 15 Dec 2008
Externally published: Yes
Event: 2nd International Conference on Signal Processing and Communication Systems, ICSPCS 2008 - Gold Coast, QLD, Australia
Duration: 15 Dec 2008 - 17 Dec 2008

Conference

Conference: 2nd International Conference on Signal Processing and Communication Systems, ICSPCS 2008
Country: Australia
City: Gold Coast, QLD
Period: 15/12/08 - 17/12/08


Cite this

Chetty, G. (2008). Investigating a two stage facial expression rating and classification technique. In Proceedings of the 2008 2nd International Conference on Signal Processing and Communication Systems (ICSPCS 2008) [4813754]. IEEE. https://doi.org/10.1109/ICSPCS.2008.4813754
@inproceedings{9e8ed4a3730b4c5382517d209cd5a129,
title = "Investigating a two stage facial expression rating and classification technique",
abstract = "In this paper, a two-stage facial expression rating and classification technique is proposed, tracking the intensity of an emotion as it evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage involving an HMM classifier for recognising six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape transformation approach is compared with a marker-based extraction method for extracting facial expression features. The performance evaluation, done on the Italian audiovisual emotion database DaFEx [1,2], comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at different intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.",
keywords = "Facial expression recognition",
author = "Girija Chetty",
year = "2008",
month = "12",
day = "15",
doi = "10.1109/ICSPCS.2008.4813754",
language = "English",
isbn = "9781424442423",
booktitle = "Proceedings of the 2008 2nd International Conference on Signal Processing and Communication Systems (ICSPCS 2008)",
publisher = "IEEE",

}


TY - GEN

T1 - Investigating a two stage facial expression rating and classification technique

AU - Chetty, Girija

PY - 2008/12/15

Y1 - 2008/12/15

N2 - In this paper, a two-stage facial expression rating and classification technique is proposed, tracking the intensity of an emotion as it evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage involving an HMM classifier for recognising six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape transformation approach is compared with a marker-based extraction method for extracting facial expression features. The performance evaluation, done on the Italian audiovisual emotion database DaFEx [1,2], comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at different intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.

AB - In this paper, a two-stage facial expression rating and classification technique is proposed, tracking the intensity of an emotion as it evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage involving an HMM classifier for recognising six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape transformation approach is compared with a marker-based extraction method for extracting facial expression features. The performance evaluation, done on the Italian audiovisual emotion database DaFEx [1,2], comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at different intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.

KW - Facial expression recognition

UR - http://www.scopus.com/inward/record.url?scp=67649653646&partnerID=8YFLogxK

U2 - 10.1109/ICSPCS.2008.4813754

DO - 10.1109/ICSPCS.2008.4813754

M3 - Conference contribution

SN - 9781424442423

BT - Proceedings of the 2008 2nd International Conference on Signal Processing and Communication Systems (ICSPCS 2008)

PB - IEEE

ER -
