Exploring transfer learning approaches for head pose classification from multi-view surveillance images

Anoop Kolar Rajagopal, Ramanathan Subramanian, Elisa Ricci, Radu L. Vieriu, Oswald Lanz, Ramakrishnan Kalpathi R., Nicu Sebe

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)

Abstract

Head pose classification from surveillance images acquired with distant, large field-of-view cameras is difficult, as faces are captured at low resolution and have a blurred appearance. Domain adaptation approaches are useful for transferring knowledge from the training (source) to the test (target) data when they have different attributes, minimizing target data labeling effort in the process. This paper examines the use of transfer learning for efficient multi-view head pose classification with minimal target training data under three challenging situations: (i) the range of head poses in the source and target images is different; (ii) the source images capture a stationary person while the target images capture a moving person, whose facial appearance varies under motion due to changing perspective and scale; and (iii) a combination of (i) and (ii). On the whole, the presented methods represent novel transfer learning solutions for multi-view head pose classification. Extensive experimental validation demonstrates that the proposed solutions considerably outperform the state of the art. Finally, this work presents the DPOSE dataset, compiled to benchmark head pose classification with moving persons and to aid behavioral understanding applications.
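The paper's own models are not reproduced here; as an illustration of the general idea the abstract describes — adapting a classifier trained on abundant labeled source data using only a few labeled target samples under a source-to-target distribution shift — the following is a minimal synthetic sketch. All data, class counts, and the shift-correction strategy are hypothetical and chosen only to make the effect visible; they are not the paper's method.

```python
import numpy as np

# Hypothetical toy illustration of few-shot domain adaptation: a nearest-centroid
# classifier trained on "source" data is corrected with a handful of labeled
# "target" samples. All data here is synthetic.
rng = np.random.default_rng(0)
n_classes, d = 4, 2
source_means = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0], [6.0, 6.0]])
domain_shift = np.array([3.5, -2.5])  # simulated source-to-target appearance change

def sample(means, n_per_class):
    X = np.vstack([m + rng.normal(0.0, 1.0, size=(n_per_class, d)) for m in means])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

Xs, ys = sample(source_means, 100)                       # abundant labeled source data
Xt_few, yt_few = sample(source_means + domain_shift, 3)  # only 3 labels per target class
Xt_test, yt_test = sample(source_means + domain_shift, 50)

def centroids(X, y):
    return np.vstack([X[y == c].mean(axis=0) for c in range(n_classes)])

def accuracy(C, X, y):
    # Assign each sample to the nearest class centroid.
    pred = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1), axis=1)
    return float((pred == y).mean())

C_src = centroids(Xs, ys)
# Adaptation step: estimate the mean source-to-target displacement from the
# few labeled target samples and shift the source centroids accordingly.
shift_est = (centroids(Xt_few, yt_few) - C_src).mean(axis=0)
C_adapted = C_src + shift_est

acc_src = accuracy(C_src, Xt_test, yt_test)      # source-only classifier on target data
acc_adp = accuracy(C_adapted, Xt_test, yt_test)  # adapted classifier
print(f"source-only: {acc_src:.2f}  adapted: {acc_adp:.2f}")
```

Under this synthetic shift, the adapted classifier recovers most of the accuracy the source-only classifier loses, which is the core motivation for transfer learning when target labels are scarce.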

Original language: English
Pages (from-to): 146-167
Number of pages: 22
Journal: International Journal of Computer Vision
Volume: 109
Issue number: 1-2
DOIs
Publication status: Published - Aug 2014
Externally published: Yes

