
Virtual EMG via Facial Video Analysis

  • Giuseppe Boccignone
  • Vittorio Cuculo
  • Giuliano Grossi
  • Raffaella Lanzarotti
  • Raffaella Migliaccio
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10484)

Abstract

In this note, we address the problem of simulating electromyographic signals arising from muscles involved in facial expressions, notably those conveying affective information, relying solely on facial landmarks detected in video sequences. We propose a method based on Gaussian Process regression to predict the facial electromyographic signal from videos in which people display non-posed affective expressions. To this end, experiments have been conducted on the OPEN EmoRec II multimodal corpus.
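A minimal sketch of how such a landmark-to-EMG mapping could be set up with Gaussian Process regression (here via scikit-learn as a stand-in; the feature arrays, kernel choice, and train/test split below are illustrative assumptions, not the authors' actual pipeline or data):

```python
# Illustrative sketch only: predicting an EMG envelope from per-frame facial
# landmark features with Gaussian Process regression (scikit-learn).
# The synthetic data, kernel, and split are assumptions for demonstration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical per-frame features (e.g. distances between mouth/brow landmarks),
# shape (n_frames, n_features), and a target surface-EMG envelope, shape (n_frames,).
n_frames, n_features = 500, 8
landmarks = rng.normal(size=(n_frames, n_features))
emg_envelope = landmarks @ rng.normal(size=n_features) + 0.1 * rng.normal(size=n_frames)

# Fit on the first part of the sequence, predict the held-out frames.
split = 400
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1),
    normalize_y=True,
)
gp.fit(landmarks[:split], emg_envelope[:split])

# Predictive mean and per-frame uncertainty for the unseen frames.
pred_mean, pred_std = gp.predict(landmarks[split:], return_std=True)
print(pred_mean.shape, pred_std.shape)  # (100,), (100,)
```

The per-frame predictive standard deviation is one reason a GP formulation is attractive here: it exposes which frames of the simulated EMG signal are well constrained by the observed landmarks and which are not.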

Notes

Acknowledgments

This research was carried out as part of the project “Interpreting emotions: a computational tool integrating facial expressions and biosignals based shape analysis and bayesian networks”, supported by the Italian Government, managed by MIUR, financed by the Future in Research Fund.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Giuseppe Boccignone (1)
  • Vittorio Cuculo (1, 2)
  • Giuliano Grossi (1)
  • Raffaella Lanzarotti (1)
  • Raffaella Migliaccio (1)

  1. PHuSe Lab - Dipartimento di Informatica, Università degli Studi di Milano, Milan, Italy
  2. Dipartimento di Matematica, Università degli Studi di Milano, Milan, Italy
