
Using 2D and 3D Face Representations to Generate Comprehensive Facial Electromyography Intensity Maps

  • Conference paper

Advances in Visual Computing (ISVC 2023)

Abstract

Electromyography (EMG) measures muscle activity. Physicians also use EMG to study the function of facial muscles through intensity maps (IMs) to support diagnostics and research. However, many existing visualizations neglect anatomical structures and disregard the physical properties of EMG signals. The variance of facial structures between people complicates the generalization of IMs, which is crucial for their correct interpretation. In our work, we overcome these issues by introducing a pipeline to generate anatomically correct IMs for facial muscles. An IM generation algorithm based on a template model incorporates custom surface EMG schemes and combines them with a projection method to highlight the IMs on the patient’s face in 2D and 3D. We evaluate the generated and projected IMs based on their projection quality for the six basic emotions on several subjects. These visualizations deepen the understanding of muscle activity areas and indicate that a holistic view of the face may be necessary to understand facial muscle activity. Medical experts can use our approach to study the function of facial muscles and to support diagnostics and therapy.

Supported by Deutsche Forschungsgemeinschaft (DFG - German Research Foundation) project 427899908 BRIDGING THE GAP: MIMICS AND MUSCLES (DE 735/15-1 and GU 463/12-1).
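To illustrate the core idea behind an intensity map, the sketch below interpolates sparse surface-EMG electrode amplitudes into a dense 2D map with radial basis functions. This is a minimal approximation only: the paper's pipeline uses an anatomical template model and custom electrode schemes, while all electrode positions, region labels, and amplitude values below are hypothetical.

```python
# Sketch: turning sparse EMG electrode readings into a dense 2D
# intensity map (IM) via radial-basis-function interpolation.
# All coordinates and amplitudes are made up for illustration.
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical electrode layout in normalized face coordinates (x, y)
electrodes = np.array([
    [0.30, 0.80], [0.70, 0.80],   # forehead (frontalis region)
    [0.25, 0.50], [0.75, 0.50],   # cheeks (zygomaticus region)
    [0.50, 0.25],                 # chin (mentalis region)
])
# One RMS amplitude per electrode (arbitrary units)
amplitudes = np.array([0.2, 0.3, 0.9, 0.8, 0.1])

# Thin-plate-spline RBF yields a smooth surface through the samples
rbf = RBFInterpolator(electrodes, amplitudes, kernel="thin_plate_spline")

# Rasterize the interpolant onto a 64x64 grid over the face area;
# the resulting array can be color-mapped and projected onto a face.
gx, gy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
grid = np.column_stack([gx.ravel(), gy.ravel()])
intensity_map = rbf(grid).reshape(64, 64)

# Exact interpolation: the map reproduces the measured values
print(np.allclose(rbf(electrodes), amplitudes))  # True
```

With zero smoothing (the default), the RBF surface passes exactly through each electrode's measurement, so the map honors the physical readings while filling in the regions between electrodes.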


Notes

  1.

    www.github.com/cvjena/electromyogram, www.github.com/cvjena/face-projection.

  2.

    All shown individuals agreed to have their images published in accordance with the GDPR.

  3.

    We support LookingGlass Portrait (Looking Glass Factory Inc., New York, USA) natively in our pipeline.



Author information


Correspondence to Tim Büchner.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Büchner, T., Sickert, S., Graßme, R., Anders, C., Guntinas-Lichius, O., Denzler, J. (2023). Using 2D and 3D Face Representations to Generate Comprehensive Facial Electromyography Intensity Maps. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2023. Lecture Notes in Computer Science, vol 14362. Springer, Cham. https://doi.org/10.1007/978-3-031-47966-3_11


  • DOI: https://doi.org/10.1007/978-3-031-47966-3_11


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-47965-6

  • Online ISBN: 978-3-031-47966-3

  • eBook Packages: Computer Science, Computer Science (R0)
