Visualizing Facial Expression Features of Pain and Emotion Data

  • Jan Sellner
  • Patrick Thiam
  • Friedhelm Schwenker (corresponding author)
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11377)

Abstract

Pain and emotions reveal important information about the state of a person and are often expressed via the face. Most systems that analyse these states consider only one type of expression. For pain, the medical context is a common scenario for automatic monitoring systems, and emotions are likely to occur there as well. Hence, such systems should not confuse the two types of expression. To facilitate advances in this field, we use video data from the BioVid Heat Pain Database, extract Action Unit (AU) intensity features, and conduct initial analyses by creating several feature visualizations. We show that the AU usage pattern is more distinct for the pain, amusement and disgust classes than for the sadness, fear and anger classes. For the former, we present additional visualizations which reveal a clearer picture of the AUs typically used per expression by highlighting dependencies between AUs (joint usages). Finally, we show that the feature discrimination quality varies heavily across the 64 tested subjects.
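The per-class "AU usage pattern" described above can be illustrated with a minimal sketch. This is not the authors' code: the synthetic data, the `au_usage_pattern` helper, and the choice of 17 AUs (the intensity outputs of OpenFace 2.0, which the paper uses for feature extraction) are assumptions for illustration; the pattern is simply the mean intensity of each AU over all frames of a class.

```python
import numpy as np

# Hypothetical AU intensity features: rows = video frames, columns = AUs
# (e.g. the 17 AU intensities OpenFace 2.0 reports, on a 0-5 scale).
rng = np.random.default_rng(0)
n_frames, n_aus = 300, 17
intensities = rng.uniform(0, 5, size=(n_frames, n_aus))

# Hypothetical per-frame expression labels: pain plus five basic emotions.
classes = ["pain", "amusement", "disgust", "sadness", "fear", "anger"]
labels = rng.integers(0, len(classes), size=n_frames)

def au_usage_pattern(intensities, labels, n_classes):
    """Mean AU intensity per class: one row per class, one column per AU."""
    pattern = np.zeros((n_classes, intensities.shape[1]))
    for c in range(n_classes):
        pattern[c] = intensities[labels == c].mean(axis=0)
    return pattern

pattern = au_usage_pattern(intensities, labels, len(classes))
# pattern[c, a] is the average intensity of AU a for expression class c;
# rendered as a heatmap, it shows which AUs dominate each expression.
```

On real data, distinct rows for pain, amusement and disgust versus flat, similar rows for sadness, fear and anger would correspond to the class separability the abstract reports.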

Keywords

Pain · Emotions · Facial expression · Feature visualization · FACS · Action Unit


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jan Sellner, Institute of Neural Information Processing, Ulm University, Ulm, Germany
  • Patrick Thiam, Institute of Neural Information Processing, Ulm University, Ulm, Germany
  • Friedhelm Schwenker (corresponding author), Institute of Neural Information Processing, Ulm University, Ulm, Germany
