Analysis of Head and Facial Gestures Using Facial Landmark Trajectories

  • Hatice Cinar Akakin
  • Bulent Sankur
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5707)

Abstract

Automatic analysis of head and facial gestures is a significant and challenging research area for human-computer interfaces. We propose a robust face and head gesture analyzer. The analyzer exploits the trajectories of facial landmark positions over the course of a head gesture or facial expression. The trajectories themselves are obtained as the output of an accurate feature detection and tracking algorithm, which combines appearance- and model-based approaches. A multi-pose deformable shape model is trained to handle shape variations under varying head rotations and facial expressions. Discriminative observation symbols extracted from the landmark trajectories drive a continuous HMM with Gaussian-mixture outputs, which is used to recognize a subset of head gestures and facial expressions. For seven gesture classes we achieve an 86.4% recognition rate.
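The recognition stage described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes each gesture sample has already been reduced to a sequence of per-frame feature vectors derived from the landmark trajectories, and it uses hmmlearn's GMMHMM as a stand-in for the continuous HMM with Gaussian-mixture outputs. One HMM is trained per gesture class, and a test sequence is assigned to the class whose model gives the highest log-likelihood.

```python
# Minimal sketch (assumed setup, not the authors' code): per-class continuous
# HMMs with Gaussian-mixture emissions over landmark-trajectory features.
import numpy as np
from hmmlearn.hmm import GMMHMM

def train_gesture_hmms(train_data, n_states=5, n_mix=3):
    """train_data: dict mapping gesture label -> list of (T, D) arrays,
    where each array holds D trajectory-derived features for T frames."""
    models = {}
    for label, seqs in train_data.items():
        X = np.vstack(seqs)               # concatenate frames of all sequences
        lengths = [len(s) for s in seqs]  # per-sequence frame counts
        hmm = GMMHMM(n_components=n_states, n_mix=n_mix,
                     covariance_type="diag", n_iter=50)
        hmm.fit(X, lengths)               # Baum-Welch training
        models[label] = hmm
    return models

def classify_gesture(models, seq):
    """Return the label whose HMM assigns the test sequence (T, D)
    the highest log-likelihood."""
    scores = {label: m.score(seq) for label, m in models.items()}
    return max(scores, key=scores.get)
```

The number of hidden states and mixture components are illustrative choices; in practice they would be tuned per gesture class, e.g. by cross-validation on the training set.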

Keywords

Automatic facial feature tracking · Head gesture and facial expression recognition



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Hatice Cinar Akakin (1)
  • Bulent Sankur (1)
  1. Electrical and Electronics Engineering Department, Bogazici University, Istanbul, Turkey
