Bio-Visual Fusion for Person-Independent Recognition of Pain Intensity

  • Markus Kächele
  • Philipp Werner
  • Ayoub Al-Hamadi
  • Günther Palm
  • Steffen Walter
  • Friedhelm Schwenker
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9132)


In this work, multi-modal fusion of video and biopotential signals is used to recognize pain in a person-independent scenario. For this purpose, participants were subjected to painful heat stimuli under controlled conditions, and a multitude of features was extracted from the available modalities. Experimental validation suggests that the cues enabling successful recognition of pain are highly similar across different people, and that the analysed modalities are complementary to an extent that fusion methods achieve an improvement over single modalities. Different fusion approaches (early, late, trainable) are compared on a large set of state-of-the-art features for the biopotential and video channels in multiple classification experiments.
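To make the distinction between the fusion approaches concrete, the following is a minimal illustrative sketch (not the authors' implementation): early fusion concatenates the per-modality feature vectors before a single classifier is trained, while late fusion combines the probability outputs of separately trained per-modality classifiers. The function names, feature vectors, and the fixed fusion weight are illustrative assumptions.

```python
# Illustrative sketch of early vs. late fusion for two modalities
# (video and biopotentials). All names and values are hypothetical.

def early_fusion(features_video, features_bio):
    """Early fusion: concatenate feature vectors from both modalities
    into one vector that a single classifier would be trained on."""
    return list(features_video) + list(features_bio)

def late_fusion(prob_video, prob_bio, weight_video=0.5):
    """Fixed-weight late fusion: combine the 'pain' class probabilities
    produced by two independently trained per-modality classifiers.
    A trainable fusion scheme would instead learn this mapping from data."""
    return weight_video * prob_video + (1.0 - weight_video) * prob_bio

# Example: the video classifier is uncertain (0.55) while the
# biopotential classifier is confident (0.90); the fused score
# leans toward the more confident modality when weighted 0.4/0.6.
fused = late_fusion(0.55, 0.90, weight_video=0.4)
print(round(fused, 2))  # 0.76
```

A trainable fusion variant would replace the fixed weighted average with a second-stage classifier that takes the per-modality scores as its input features.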


Keywords: Feature Selection · Facial Expression · Empirical Mode Decomposition · Skin Conductance Level · Late Fusion



This paper is based on work done within the Transregional Collaborative Research Centre SFB/TRR 62 Companion-Technology for Cognitive Technical Systems funded by the German Research Foundation (DFG). Markus Kächele is supported by a scholarship of the Landesgraduiertenförderung Baden-Württemberg at Ulm University.

This work was performed on the computational resource bwUniCluster funded by the Ministry of Science, Research and Arts and the Universities of the State of Baden-Württemberg, Germany, within the framework program bwHPC.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Markus Kächele (1)
  • Philipp Werner (2)
  • Ayoub Al-Hamadi (2)
  • Günther Palm (1)
  • Steffen Walter (3)
  • Friedhelm Schwenker (1)
  1. Institute of Neural Information Processing, Ulm University, Ulm, Germany
  2. Institute of Information Technology, University of Magdeburg, Magdeburg, Germany
  3. Department of Psychosomatic Medicine and Psychotherapy, Ulm University, Ulm, Germany
