A novel shape transformation approach for quantizing facial expressions

    Research output: A Conference proceeding or a Chapter in Book › Conference contribution

    Abstract

    In this paper, a novel methodology for facial expression rating is proposed, as the intensity of an emotion evolves from neutral to high expression. The face is modeled as a combination of sectors and their boundaries. An expression change in a face is characterised and quantified through a combination of non-rigid deformations. After elastic interpolation, this yields a geometry-based, high-dimensional 2D shape transformation, which is used to register regions defined on query faces. This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued Sector Volumetric Difference (SVD) function, which characterises and quantifies the facial expression. A two-stage expression classification is used, with the first stage detecting low, medium and high levels of expression, and the second stage using an HMM classifier to recognise six different facial emotions: anger, disgust, fear, happiness, sadness and surprise. Further, the proposed shape transformation approach is compared with a marker-based method for extracting facial expression features. A performance evaluation on the Italian audiovisual emotion database DaFex, comprising facial expression data from several actors eliciting six different emotions (anger, disgust, fear, happiness, sadness and surprise) at three intensities (low, medium and high), shows a significant improvement in expression classification for the proposed shape-transformation approach.
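    The core reduction the abstract describes — collapsing a vector-valued 2D deformation field to one scalar per facial sector — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's formulation: the sector masks, the synthetic field, and the magnitude-sum reduction are all assumptions made for the example.

    ```python
    import numpy as np

    def sector_volumetric_difference(deformation, sector_masks):
        """Reduce a vector-valued 2D deformation field to one scalar per
        facial sector by summing displacement magnitude inside each sector.

        deformation  : (H, W, 2) array of per-pixel (dx, dy) displacements
        sector_masks : (K, H, W) boolean array, one mask per facial sector
        returns      : (K,) array of per-sector scalar scores
        """
        magnitude = np.linalg.norm(deformation, axis=-1)   # (H, W)
        return np.array([magnitude[m].sum() for m in sector_masks])

    # Synthetic example: a 4x4 field split into left/right sectors.
    field = np.zeros((4, 4, 2))
    field[:, 2:, 0] = 3.0   # right half displaced 3 px horizontally
    field[:, 2:, 1] = 4.0   # and 4 px vertically -> magnitude 5 per pixel

    left = np.zeros((4, 4), dtype=bool)
    left[:, :2] = True
    right = ~left

    svd = sector_volumetric_difference(field, np.stack([left, right]))
    print(svd)   # [ 0. 40.]  (8 pixels x magnitude 5 in the right sector)
    ```

    Such per-sector scalars could then serve as a feature vector for the two-stage classifier the abstract mentions (intensity level first, then emotion via an HMM).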
    Original language: English
    Title of host publication: Proceedings Digital Image Computing: Techniques and Applications DICTA 2008
    Editors: Antonio Robles-Kelly
    Place of Publication: United States
    Publisher: IEEE, Institute of Electrical and Electronics Engineers
    Pages: 168-175
    Number of pages: 8
    ISBN (Print): 9780769534565
    DOI: 10.1109/DICTA.2008.96
    Publication status: Published - 2008
    Event: Digital Image Computing: Techniques and Applications DICTA 2008 - Canberra, Australia
    Duration: 1 Dec 2008 - 3 Dec 2008

    Conference

    Conference: Digital Image Computing: Techniques and Applications DICTA 2008
    Country: Australia
    City: Canberra
    Period: 1/12/08 - 3/12/08


    Cite this

    Chetty, G. (2008). A novel shape transformation approach for quantizing facial expressions. In A. Robles-Kelly (Ed.), Proceedings Digital Image Computing: Techniques and Applications DICTA 2008 (pp. 168-175). United States: IEEE, Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/DICTA.2008.96
