Recognizing Facial Expression Using Particle Filter Based Feature Points Tracker

  • Rakesh Tripathi
  • R. Aravind
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4815)

Abstract

This paper presents an evaluation of a particle filter based facial feature tracker. The particle filter is a successful tool for non-linear and non-Gaussian estimation problems. We developed a particle filter based facial feature point tracker with a simple observation model based on the sum of squared differences (SSD) between intensities. A multistate face component model is used to estimate occluded feature points, and the important distances are computed from the tracked points. Two classification schemes are considered: the hidden Markov model (HMM) as a sequence-based recognizer and the support vector machine (SVM) as a frame-based recognizer. A comparative study is presented for the classification of five basic expressions, i.e., anger, sadness, happiness, surprise and disgust. The tests are conducted on the Cohn-Kanade and MMI facial expression databases.
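As a rough illustration of the tracking stage summarized above, the following Python sketch implements a single-point particle filter whose observation likelihood is derived from the SSD between an image patch and a reference template. It is a minimal sketch under stated assumptions: the function names, window size, motion noise, and likelihood scaling are illustrative choices, not the authors' implementation, and the multistate face component model used for occluded points is not shown.

# Minimal sketch of a particle-filter point tracker with an SSD-based
# observation model. All parameters here are illustrative assumptions.
import numpy as np

def ssd(patch_a, patch_b):
    """Sum of squared intensity differences between two equally sized patches."""
    diff = patch_a.astype(np.float64) - patch_b.astype(np.float64)
    return np.sum(diff * diff)

def track_point(frames, init_xy, n_particles=200, half_win=5,
                motion_std=2.0, obs_beta=1e-3, rng=None):
    """Track a single feature point through a list of 2-D grayscale frames.

    frames  : list of 2-D numpy arrays (grayscale images)
    init_xy : (x, y) location of the point in the first frame,
              assumed to lie away from the image border.
    Returns the estimated (x, y) location for every frame.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = frames[0].shape
    x0, y0 = int(round(init_xy[0])), int(round(init_xy[1]))
    # Reference template: intensity patch around the initial point.
    template = frames[0][y0 - half_win:y0 + half_win + 1,
                         x0 - half_win:x0 + half_win + 1]

    particles = np.tile(np.array([x0, y0], dtype=np.float64), (n_particles, 1))
    estimates = [(float(x0), float(y0))]

    for frame in frames[1:]:
        # Prediction: random-walk motion model with Gaussian noise.
        particles += rng.normal(0.0, motion_std, particles.shape)
        particles[:, 0] = np.clip(particles[:, 0], half_win, w - half_win - 1)
        particles[:, 1] = np.clip(particles[:, 1], half_win, h - half_win - 1)

        # Observation: SSD between each particle's patch and the template.
        ssds = np.empty(n_particles)
        for i, (px, py) in enumerate(np.round(particles).astype(int)):
            patch = frame[py - half_win:py + half_win + 1,
                          px - half_win:px + half_win + 1]
            ssds[i] = ssd(patch, template)

        # Convert SSD to weights; subtract the minimum SSD for numerical stability.
        weights = np.exp(-obs_beta * (ssds - ssds.min()))
        weights /= weights.sum()

        # Point estimate is the weighted mean; resample to avoid degeneracy.
        est = weights @ particles
        estimates.append((est[0], est[1]))
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]

    return estimates

In practice such a tracker would be run independently for each facial feature point, and the resulting point trajectories would then be converted into the distance features fed to the HMM or SVM classifiers described in the abstract.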

Keywords

Support Vector Machine · Facial Expression · Hidden Markov Model · Feature Point · Recognition Rate


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Rakesh Tripathi (1)
  • R. Aravind (1)
  1. Department of Electrical Engineering, Indian Institute of Technology Madras, Chennai-36, India
