From Motion to Emotion Prediction: A Hidden Biometrics Approach

  • Fawzi Rida
  • Liz Rincon Ardila
  • Luis Enrique Coronado
  • Amine Nait-ali
  • Gentiane Venture
Chapter
Part of the book series: Series in BioEngineering (SERBIOENG)

Abstract

This chapter discusses the use of motion recognition to predict human emotion. Treated as a behavioral hidden-biometrics approach, a dedicated system has been developed for this purpose, in which several machine-learning methods are considered: SVM, RF, MLP and KNN for classification, and SVR, RFR, MLPR and KNNR for regression. The study reports promising results in comparison with the state of the art.
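
The chapter's own pipeline is not reproduced on this page, but all eight estimator families named above are available in scikit-learn, so a minimal sketch of the comparison is easy to set up. The synthetic feature matrix, labels, array sizes and hyperparameters below are illustrative placeholders, not the chapter's actual motion data or configuration.

    # A hedged sketch: compare the classifiers and regressors named in the
    # abstract on placeholder data, assuming motion features have already
    # been extracted into one fixed-length vector per trial.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC, SVR
    from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
    from sklearn.neural_network import MLPClassifier, MLPRegressor
    from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))        # placeholder motion features
    y_cls = rng.integers(0, 4, size=200)  # placeholder emotion classes
    y_reg = rng.normal(size=200)          # placeholder affect score (e.g. valence)

    classifiers = {
        "SVM": SVC(kernel="rbf"),
        "RF": RandomForestClassifier(n_estimators=100, random_state=0),
        "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
        "KNN": KNeighborsClassifier(n_neighbors=5),
    }
    regressors = {
        "SVR": SVR(kernel="rbf"),
        "RFR": RandomForestRegressor(n_estimators=100, random_state=0),
        "MLPR": MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
        "KNNR": KNeighborsRegressor(n_neighbors=5),
    }

    # 5-fold cross-validation; features are standardized inside each fold.
    for name, model in classifiers.items():
        scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y_cls, cv=5)
        print(f"{name}: accuracy {scores.mean():.2f} +/- {scores.std():.2f}")

    for name, model in regressors.items():
        scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y_reg,
                                 cv=5, scoring="r2")
        print(f"{name}: R^2 {scores.mean():.2f} +/- {scores.std():.2f}")

With real extracted motion features in place of the random arrays, the same loop yields the per-model comparison the abstract summarizes; on the placeholder data the scores are, of course, meaningless.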

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Fawzi Rida (1)
  • Liz Rincon Ardila (2)
  • Luis Enrique Coronado (2)
  • Amine Nait-ali (3)
  • Gentiane Venture (2), corresponding author

  1. Soft Consulting, Paris, France
  2. GV Lab, Tokyo University of Agriculture and Technology, Tokyo, Japan
  3. Université Paris-Est, LISSI, UPEC, Vitry sur Seine, France