Emotion Estimation using Geometric Features from Human Lower Mouth Portion

  • P. Shanthi
  • A. Vadivel
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 366)

Abstract

This paper presents an approach for emotion estimation using geometrical features from the lower mouth portion of the face. Basic geometric transformation features are extracted from the lower face. A point-based tracking method using an Associative Recurrent Neural Network (ARNN) is developed, whose inputs are the most contributing features identified from 65-dimensional data. The effectiveness of the proposed approach is evaluated on the JAFFE and Yale data sets, both separately and combined, with good recognition rates for some basic emotions.
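To illustrate the kind of pipeline the abstract outlines, the sketch below extracts simple geometric features (pairwise distances) from hypothetical lower-mouth landmark points and then keeps the most contributing features. The landmark layout, the toy data, and the ANOVA F-score ranking are illustrative assumptions standing in for the paper's actual feature-selection step; the ARNN tracker/classifier itself is not reproduced here.

```python
# Minimal sketch (not the authors' implementation): geometric features from
# lower-mouth landmarks plus a generic "most contributing features" step.
# Landmark coordinates, the number of points, and the F-score ranking are
# all illustrative assumptions; the ARNN classifier is omitted.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif


def lower_mouth_features(landmarks: np.ndarray) -> np.ndarray:
    """Pairwise Euclidean distances between lower-mouth landmark points.

    landmarks: array of shape (n_points, 2) with (x, y) coordinates.
    Returns a flat vector of all pairwise distances (upper triangle).
    """
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(landmarks), k=1)
    return dists[iu]


# Toy data: 100 samples, 12 hypothetical lower-mouth points, 6 emotion labels.
rng = np.random.default_rng(0)
X = np.stack([lower_mouth_features(rng.normal(size=(12, 2))) for _ in range(100)])
y = rng.integers(0, 6, size=100)

# Rank features by ANOVA F-score and keep the top k (k chosen arbitrarily);
# the selected subset would then be fed to a classifier such as the ARNN.
selector = SelectKBest(f_classif, k=10).fit(X, y)
X_selected = selector.transform(X)
print(X.shape, "->", X_selected.shape)
```

In this sketch the distance vector plays the role of the geometric feature set, and the selector mimics reducing a higher-dimensional feature space to its most discriminative components before classification.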

Keywords

Emotion · Facial expression · Geometric feature · Feature selection · Associative Recurrent Neural Network

Acknowledgement

This work is supported by a research grant from the Indo-US 21st Century Knowledge Initiative programme under Grant F. No/94-5/2013 (IC), dated 19-08-2013.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Cognitive Science Research Group, Department of Computer Applications, National Institute of Technology, Tiruchirappalli, India