Modulation of Cognitive Goals and Sensorimotor Actions in Face-to-Face Communication by Emotional States: The Action-Based Approach

  • Conference paper
Recent Advances of Neural Network Models and Applications

Part of the book series: Smart Innovation, Systems and Technologies (SIST, volume 26)

Abstract

Cognitive goals – i.e., the intention to utter a sentence and to produce co-speech facial and hand-arm gestures – as well as the sensorimotor realization of the intended speech, co-speech facial, and co-speech hand-arm actions are modulated by the speaker's emotional state. This review paper illustrates how cognitive goals and sensorimotor speech, co-speech facial, and co-speech hand-arm actions are modulated by the speaker's emotional state; how interlocutors perceive and recognize emotional states in face-to-face communication; and which brain regions are responsible for the production and perception of emotions in face-to-face communication.




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Kröger, B.J. (2014). Modulation of Cognitive Goals and Sensorimotor Actions in Face-to-Face Communication by Emotional States: The Action-Based Approach. In: Bassis, S., Esposito, A., Morabito, F. (eds) Recent Advances of Neural Network Models and Applications. Smart Innovation, Systems and Technologies, vol 26. Springer, Cham. https://doi.org/10.1007/978-3-319-04129-2_38

  • DOI: https://doi.org/10.1007/978-3-319-04129-2_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-04128-5

  • Online ISBN: 978-3-319-04129-2

  • eBook Packages: Engineering (R0)
