Stereo 3D Lip Tracking

Gareth Loy, Roland GOECKE, Sebastian Rougeaux, Alexander Zelinsky

Research output: Conference contribution (peer-reviewed)


A system is presented that tracks a person's unadorned lips in 3D, outputting the 3D locations of the mouth corners and of ten points describing the outer lip contour. This output is suitable for audio-visual speech processing, 3D animation, or expression recognition. A stereo head tracker follows the subject's head, allowing robust performance while the head moves and turns with respect to the cameras. The head pose is used in conjunction with novel adaptable templates to generate a robust estimate of the deforming mouth-corner locations. A 3D geometric model generates search paths for key points on the outer lip contour, which are then located using adaptable templates and geometric constraints. The system is demonstrated robustly tracking the head pose and 3D mouth shape of a person speaking while moving his head.
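The stereo triangulation underlying this kind of system — recovering a 3D point such as a mouth corner from its pixel coordinates in two calibrated cameras — can be sketched as follows. This is a minimal linear (DLT) triangulation illustration, not the authors' implementation; the camera matrices, baseline, and test point are invented for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: 2D pixel coordinates."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenise

def project(P, X):
    """Project a 3D point into a camera with projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy rectified stereo rig: identical intrinsics, 6 cm baseline along x.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])

# Project a known 3D "mouth corner" point into both views and recover it.
X_true = np.array([0.01, 0.02, 0.5])  # metres, in the left-camera frame
X_rec = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_rec, X_true, atol=1e-6))  # True (noise-free case)
```

In practice the paper's adaptable templates would supply the matched 2D mouth-corner positions in each image, and the triangulated 3D points would then be constrained by the head pose and the 3D geometric lip model.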
Original language: English
Title of host publication: Sixth International Conference on Control, Automation, Robotics and Vision Conference Proceedings (ICARCV 2000)
Subtitle of host publication: Intelligent Automation in the New Millennium
Place of publication: Singapore
Publisher: Nanyang Technological University
Number of pages: 6
ISBN (Electronic): 9810434456
Publication status: Published - 5 Dec 2000
Event: International Conference on Control, Automation, Robotics and Vision 2000 - Marina Mandarin, Singapore, Singapore
Duration: 5 Dec 2000 - 8 Dec 2000
Conference number: 6th


Conference: International Conference on Control, Automation, Robotics and Vision 2000
Abbreviated title: ICARCV 2000

