
Human emotion and cognition recognition from body language of the head using soft computing techniques

  • Original Research
  • Journal of Ambient Intelligence and Humanized Computing

Abstract

To make a computer interface more usable, enjoyable, and effective, it should be able to recognize the emotions of its human counterpart. This paper explores new ways to infer the user's emotions and cognitions from the combination of facial expression (happy, angry, or sad), eye gaze (direct or averted), and head movement (direction and frequency). All of the extracted information is taken as input data, and soft computing techniques are applied to infer emotional and cognitive states. The fuzzy rules were defined based on the opinions of an expert in psychology, a pilot group, and annotators. Although the fuzzy rules themselves are specific to a given culture, the idea of integrating the different modalities of the body language of the head is generic enough to be applied to any target user group from any culture. Experimental results show that this method can successfully recognize 10 different emotional and cognitive states.
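
To make the inference style concrete, the sketch below shows, in Python, how a single Mamdani-style fuzzy rule could combine the three modalities the abstract names. It is a minimal illustration under assumed labels and thresholds: the rule, the "agreement" state, the membership-function breakpoints, and all function names are hypothetical, not the paper's actual rule base.

```python
# Illustrative Mamdani-style fuzzy rule combining facial expression,
# eye gaze, and head-movement frequency into a score for one candidate
# state. All labels, thresholds, and the rule itself are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function rising over [a, b], falling over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def head_freq_high(freq_hz):
    # Assumption: a nod/shake frequency near 2 Hz counts as fully "high".
    return tri(freq_hz, 0.5, 2.0, 3.5)

def infer_agreement(expression, gaze, nod_freq_hz):
    """IF expression is happy AND gaze is direct AND nodding is fast,
    THEN the state is 'agreement'. AND is modelled as min(), as in
    classic Mamdani inference; crisp classifier labels act as 0/1
    memberships."""
    mu_happy = 1.0 if expression == "happy" else 0.0
    mu_direct = 1.0 if gaze == "direct" else 0.0
    mu_fast = head_freq_high(nod_freq_hz)
    return min(mu_happy, mu_direct, mu_fast)  # rule firing strength in [0, 1]

print(infer_agreement("happy", "direct", 1.8))  # ≈ 0.87
```

A full system of the kind the paper describes would hold one such rule (or several) per emotional or cognitive state, then select the state whose rules fire most strongly, or defuzzify over the overlapping consequents.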


Author information

Correspondence to Yisu Zhao.

Cite this article

Zhao, Y., Wang, X., Goubran, M. et al. Human emotion and cognition recognition from body language of the head using soft computing techniques. J Ambient Intell Human Comput 4, 121–140 (2013). https://doi.org/10.1007/s12652-012-0107-1

