Modulation of Cognitive Goals and Sensorimotor Actions in Face-to-Face Communication by Emotional States: The Action-Based Approach

  • Bernd J. Kröger
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 26)


Cognitive goals – i.e. the intention to utter a sentence and to produce co-speech facial and hand-arm gestures – as well as the sensorimotor realization of the intended speech, co-speech facial, and co-speech hand-arm actions are modulated by the emotional state of the speaker. This review illustrates how emotional states modulate these cognitive goals and sensorimotor actions, how emotional states are perceived and recognized by interlocutors in face-to-face communication, and which brain regions are responsible for the production and perception of emotions in this context.


Keywords: face-to-face communication, emotion, speech, facial expression, gesture, sensorimotor action, emotional speech, brain imaging, fMRI




References

  1. Levelt, W.J.M., Roelofs, A., Meyer, A.S.: A theory of lexical access in speech production. Behavioral and Brain Sciences 22, 1–75 (1999)
  2. Levelt, W.J.M.: Models of word production. Trends in Cognitive Sciences 3, 223–232 (1999)
  3. Guenther, F.H.: Cortical interactions underlying the production of speech sounds. Journal of Communication Disorders 39, 350–365 (2006)
  4. Guenther, F.H., Ghosh, S.S., Tourville, J.A.: Neural modeling and imaging of the cortical interactions underlying syllable production. Brain and Language 96, 280–301 (2006)
  5. Kröger, B.J., Kannampuzha, J., Neuschaefer-Rube, C.: Towards a neurocomputational model of speech production and perception. Speech Communication 51, 793–809 (2009)
  6. Halberstadt, J.B., Niedenthal, P.M., Kushner, J.: Resolution of lexical ambiguity by emotional state. Psychological Science 6, 278–282 (1995)
  7. Bänziger, T., Scherer, K.R.: The role of intonation on emotional expressions. Speech Communication 46, 252–267 (2005)
  8. Scherer, K.R.: Vocal communication of emotion: A review of research paradigms. Speech Communication 40, 227–256 (2003)
  9. Ekman, P., Oster, H.: Facial expressions of emotion. Annual Review of Psychology 30, 527–554 (1979)
  10. Castellano, G., Villalba, S.D., Camurri, A.: Recognising human emotions from body movement and gesture dynamics. In: Paiva, A.C.R., Prada, R., Picard, R.W. (eds.) ACII 2007. LNCS (LNAI), vol. 4738, pp. 71–82. Springer, Heidelberg (2007)
  11. Kröger, B.J., Kopp, S., Lowit, A.: A model for production, perception, and acquisition of actions in face-to-face communication. Cognitive Processing 11, 187–205 (2010)
  12. Kröger, B.J., Birkholz, P., Kaufmann, E., Neuschaefer-Rube, C.: Beyond vocal tract actions: speech prosody and co-verbal gesturing in face-to-face communication. In: Kröger, B.J., Birkholz, P. (eds.) Studientexte zur Sprachkommunikation: Elektronische Sprachsignalverarbeitung 2011, pp. 195–204. TUDpress, Dresden (2011)
  13. Kendon, A.: Gesture: Visible Action as Utterance. Cambridge University Press, New York (2004)
  14. Kopp, S., Wachsmuth, I.: Synthesizing multimodal utterances for conversational agents. Journal of Computer Animation and Virtual Worlds 15, 39–51 (2004)
  15. Ekman, P., Friesen, W.V.: Facial Action Coding System. Consulting Psychologists Press, Palo Alto (1978)
  16. Cohn, J.F., Ambadar, Z., Ekman, P.: Observer-based measurement of facial expression with the facial action coding system. In: Coan, J.A., Allen, J.J.B. (eds.) Handbook of Emotion Elicitation and Assessment, pp. 203–221. Oxford University Press US, New York (2007)
  17. Kröger, B.J., Birkholz, P., Kannampuzha, J., Kaufmann, E., Mittelberg, I.: Movements and holds in fluent sentence production of American Sign Language: The action-based approach. Cognitive Computation 3, 449–465 (2011)
  18. Kröger, B.J., Birkholz, P.: A gesture-based concept for speech movement control in articulatory speech synthesis. In: Esposito, A., Faundez-Zanuy, M., Keller, E., Marinaro, M. (eds.) COST Action 2102. LNCS (LNAI), vol. 4775, pp. 174–189. Springer, Heidelberg (2007)
  19. Schmidt, K.L., Ambadar, Z., Cohn, J.F., Reed, L.I.: Movement differences between deliberate and spontaneous facial expressions: Zygomaticus major action in smiling. Journal of Nonverbal Behavior 30, 37–52 (2006)
  20. Arbib, M.A., Fellous, J.M.: Emotions: from brain to robot. Trends in Cognitive Sciences 8, 554–561 (2004)
  21. Breazeal, C.: Emotion and sociable humanoid robots. International Journal of Human-Computer Studies 59, 119–155 (2003)
  22. Ekman, P.: An argument for basic emotions. Cognition and Emotion 6, 169–200 (1992)
  23. LeDoux, J.E.: Emotion circuits in the brain. Annual Review of Neuroscience 23, 155–184 (2000)
  24. Lazarus, R.S.: Cognition and motivation in emotion. American Psychologist 46, 352–367 (1991)
  25. Pessoa, L., Adolphs, R.: Emotion processing and the amygdala: from a ‘low road’ to ‘many roads’ of evaluating biological significance. Nature Reviews Neuroscience 11, 773–782 (2010)
  26. Whalen, P.J., Raila, H., Bennett, R., Mattek, A., Brown, A., Taylor, J., van Tieghem, M., Tanner, A., Miner, M., Palme, A.: Neuroscience and facial expressions of emotion: the role of amygdala-prefrontal interactions. Emotion Review 5, 78–83 (2013)
  27. Brück, C., Kreifelts, B., Ethofer, T., Wildgruber, D.: Emotional voices: the tone of (true) feelings. In: Armony, J., Vuilleumier, P. (eds.) The Cambridge Handbook of Human Affective Neuroscience, pp. 256–285. Cambridge University Press, New York (2013)
  28. Kesler-West, M.L., Andersen, A.H., Smith, C.D., Avison, M.J., Davis, C.E., Kryscio, R.J., Blonder, L.X.: Neural substrates of facial emotion processing using fMRI. Cognitive Brain Research 11, 213–226 (2001)
  29. Mitsuyoshi, S., Monnma, F., Tanaka, Y., Minami, T., Kato, M., Murata, T.: Identifying neural components of emotion in free conversation with fMRI. In: Defense Science Research Conference and Expo, Singapore, pp. 1–4 (2011), doi:10.1109/DSR.2011.6026845
  30. Aziz-Zadeh, L., Sheng, T., Gheytanchi, A.: Common premotor regions for the perception and production of prosody and correlations with empathy and prosodic ability. PLoS ONE 5, e8759, 1–7 (2010), doi:10.1371/journal.pone.0008759
  31. Bauer, D., Kannampuzha, J., Kröger, B.J.: Articulatory Speech Re-Synthesis: Profiting from natural acoustic speech data. In: Esposito, A., Vích, R. (eds.) Cross-Modal Analysis of Speech, Gestures, Gaze and Facial Expressions. LNCS, vol. 5641, pp. 344–355. Springer, Heidelberg (2009)
  32. Martin, O., Kotsia, I., Macq, B., Pitas, I.: The eNTERFACE05 Audio-Visual Emotion Database. In: First IEEE Workshop on Multimedia Database Management, Atlanta, USA (2006), doi:10.1109/ICDEW.2006.145
  33. Lücking, A., Bergmann, K., Hahn, F., Kopp, S., Rieser, H.: Data-based analysis of speech and gesture: the Bielefeld Speech and Gesture Alignment corpus (SaGA) and its applications. Journal on Multimodal User Interfaces (2012), doi:10.1007/s12193-012-0106-8

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Bernd J. Kröger (1, 2)
  1. Neurophonetics Group, Department of Phoniatrics, Pedaudiology, and Communication Disorders, Medical School, RWTH Aachen University, Aachen, Germany
  2. Cognitive Computation and Applications Laboratory, School of Computer Science and Technology, Tianjin University, Tianjin, China