Emotion Recognition in Facial Image Sequences Using a Combination of AAM with FACS and DBN

  • Kwang-Eun Ko
  • Kwee-Bo Sim
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6424)


Abstract

To recognize human emotion in face-to-face interaction, we must first extract emotional features from facial images and then classify the corresponding emotional states. Our facial emotional feature detection and extraction is based on Active Appearance Models (AAM) combined with Ekman's Facial Action Coding System (FACS), and our approach to facial emotion recognition rests on a dynamic, probabilistic framework built on a Dynamic Bayesian Network (DBN). The AAM is a well-known method for representing non-rigid objects such as faces and facial expressions. In this paper, we propose a feature extraction method that combines AAM with the Facial Action Units of FACS to automatically model and extract facial emotional features. We then use DBNs to model and understand the temporal phases of facial expressions in image sequences.
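The pipeline the abstract describes — per-frame Action Unit (AU) features obtained from an AAM fit, followed by probabilistic temporal inference over the image sequence — can be sketched as follows. This is a minimal illustration, not the paper's actual model: the emotion states, AU combinations, and all probability tables below are hypothetical, and the DBN is reduced to its simplest form (one hidden emotion node per frame, i.e. HMM-style forward filtering).

```python
import numpy as np

# Hypothetical sketch: per-frame AU observations (as an AAM+FACS front end
# would supply) drive a simple temporal emotion model. States, AU combos,
# and probabilities are illustrative only.
EMOTIONS = ["happiness", "surprise", "sadness"]          # hidden states E_t
AUS = ["AU6+AU12", "AU1+AU2+AU26", "AU1+AU4+AU15"]       # observed AU combos

prior = np.array([1/3, 1/3, 1/3])                        # P(E_1)
trans = np.array([[0.8, 0.1, 0.1],                       # P(E_t | E_{t-1})
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
emit = np.array([[0.7, 0.2, 0.1],                        # P(AU_t | E_t)
                 [0.2, 0.7, 0.1],
                 [0.1, 0.1, 0.8]])

def filter_emotions(observations):
    """Forward filtering: returns P(E_t | AU_1..AU_t) for each frame t."""
    belief = prior * emit[:, observations[0]]
    belief /= belief.sum()
    beliefs = [belief]
    for obs in observations[1:]:
        # Predict with the transition model, then correct with the evidence.
        belief = (trans.T @ belief) * emit[:, obs]
        belief /= belief.sum()
        beliefs.append(belief)
    return beliefs

# Example: three frames all showing the smile-related AU combination (index 0).
beliefs = filter_emotions([0, 0, 0])
print(EMOTIONS[int(np.argmax(beliefs[-1]))])  # prints "happiness"
```

Repeated smile-related AU evidence makes the "happiness" belief dominate over the frames, which is the intuition behind using a DBN over single-frame classification: the temporal model smooths noisy per-frame AU detections.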


Keywords: Facial Emotion Recognition · Facial Feature Extraction · Active Appearance Model · Facial Action Coding System · Dynamic Bayesian Network




References

  1. Zhang, Y., Ji, Q.: Active and Dynamic Information Fusion for Facial Expression Understanding from Image Sequences. IEEE Transactions on Pattern Analysis and Machine Intelligence 27(5), 699–714 (2005)
  2. Ekman, P., Friesen, W.V.: The repertoire of nonverbal behavior: Categories, origins, usage, and coding. Semiotica 1, 49–98 (1969)
  3. DataFace Site: Facial Expressions, Emotions Expressions, Nonverbal Communication, Physiognomy
  4. Padgett, C., Cottrell, G.: Representing Face Images for Emotion Classification. In: Mozer, M., Jordan, M., Petsche, T. (eds.) Advances in Neural Information Processing Systems, vol. 9 (1997)
  5. Lyons, M.J., Budynek, J., Akamatsu, S.: Automatic Classification of Single Facial Images. IEEE Transactions on Pattern Analysis and Machine Intelligence 21(12), 1357–1362 (1999)
  6. Yi-Bin, S., Jian-Ming, Z., Jian-Hua, T., Geng-Tao, Z.: An improved facial feature localization method based on ASM. In: 7th International Conference on Computer-Aided Industrial Design and Conceptual Design, CAIDCD 2006 (2006)
  7. Kobayashi, S., Hashimoto, S.: Automated feature extraction of face image and its applications. In: International Workshop on Robot and Human Communication, pp. 164–169
  8. Liu, J., Udupa, J.K.: Oriented Active Shape Models. IEEE Transactions on Medical Imaging 28(4), 571–584 (2009)
  9. Cootes, T.F., Edwards, G.J., Taylor, C.J.: Active Appearance Models. IEEE Transactions on Pattern Analysis and Machine Intelligence 23(6), 681–685 (2001)
  10. Stegmann, M.B.: Analysis and Segmentation of Face Images using Point Annotations and Linear Subspace Techniques. Informatics and Mathematical Modelling, Lyngby, Denmark: Tech. Univ. Denmark, IMM Tech. Rep. IMM-REP-2002-22 (August 2002)
  11. Ekman, P., Friesen, W.V.: Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Palo Alto (1978)
  12.
  13. Mita, T., Kaneko, T., Hori, O.: Joint Haar-like Features for Face Detection. In: Proceedings of the 10th IEEE International Conference on Computer Vision, October 17–21, vol. 2, pp. 1619–1626 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Kwang-Eun Ko (1)
  • Kwee-Bo Sim (1)

  1. School of Electrical and Electronics Engineering, Chung-Ang University, Seoul, Korea
