Audiomotor integration of angry and happy prosodies

  • Original Article
  • Published in Psychological Research

Abstract

Different parts of our brain code the perceptual features and actions related to an object, causing a binding problem: how does the brain discriminate the information of a particular event from the features of other events? Hommel (1998) suggested the event file concept: an episodic memory trace binding the perceptual and motor information pertaining to an object. By adapting Hommel’s paradigm to emotional faces in a previous study (Coll & Grandjean, 2016), we demonstrated that emotion can be bound with motor responses in an event file. We postulated that such binding also occurs with emotional prosodies, given the comparable importance of automatic reactions to these events. However, contrary to static emotional expressions, prosodies unfold over time, and these temporal dynamics may influence the integration of such stimuli. To investigate this question, we conducted three studies with task-relevant and task-irrelevant emotional prosodies. Our results showed that emotion interacted with motor responses when it was task relevant. When it was task irrelevant, this integration was also observed, but only when participants were led to focus on the details of the voices, that is, in a loudness task. No such binding was observed when participants performed a location task, in which emotion could be ignored. These results indicate that emotional binding is not restricted to visual information but is a general phenomenon allowing organisms to integrate emotion and action in an efficient and adaptive way. We discuss the influence of temporal dynamics on emotion–action binding and the implications of Hommel’s paradigm.


Notes

  1. We report effect sizes following the approach of Nakagawa and Schielzeth (2013), as implemented in the MuMIn R package. Their approach is based on two indicators, the marginal and conditional R² (R²m and R²c, respectively), which allow comparability with standard methods while taking into account the variance explained by random effects. R²m is the variance explained by the fixed effects alone, whereas R²c is the variance explained by the entire model (both fixed and random effects). We calculated these variances for each effect in our statistical models.

  2. The mean intensity ratings for the angry, happy, and neutral prosodies of Frühholz et al. (2014) were 51.92 (SD = 5.63), 48.30 (SD = 10.94), and 21.89 (SD = 3.82), respectively. On average, these prosodies were recognized as expressing anger at 55.91 (SD = 10.86), 7.64 (SD = 4.52), and 2.85 (SD = 1.96), respectively; as expressing happiness at 0.83 (SD = 1.53), 34.74 (SD = 21.53), and 0.25 (SD = 0.48), respectively; and as expressing neutrality at 5.57 (SD = 3.94), 8.96 (SD = 9.69), and 70.11 (SD = 6.50), respectively.
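The marginal and conditional R² described in note 1 reduce to simple ratios of variance components. As a minimal sketch (in Python rather than R, with hypothetical variance components for illustration; in practice the MuMIn package computes these directly from a fitted model), assuming a Gaussian mixed model with an identity link:

```python
def r2_mixed(var_fixed, var_random, var_residual):
    """Marginal and conditional R² for a Gaussian mixed model,
    following Nakagawa & Schielzeth (2013).

    var_fixed    : variance of the fixed-effect predictions
    var_random   : summed variances of the random effects
    var_residual : residual variance
    """
    total = var_fixed + var_random + var_residual
    r2m = var_fixed / total                  # fixed effects only (marginal)
    r2c = (var_fixed + var_random) / total   # fixed + random effects (conditional)
    return r2m, r2c

# Hypothetical variance components, chosen only to illustrate the ratios
r2m, r2c = r2_mixed(var_fixed=2.0, var_random=1.0, var_residual=1.0)
print(r2m, r2c)  # 0.5 0.75
```

By construction, R²c is always at least as large as R²m, since it credits the model with the variance captured by the random effects as well.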

References

  • Algazi, V. R., Duda, R. O., Thompson, D. M., & Avendano, C. (2001). The CIPIC HRTF database. In IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (pp. 99–102). New Paltz, NY.

  • Bolker, B. M., Brooks, M. E., Clark, C. J., Geange, S. W., Poulsen, J. R., Stevens, M. H. H., & White, J. S. S. (2009). Generalized linear mixed models: A practical guide for ecology and evolution. Trends in Ecology and Evolution, 24(3), 127–135.

  • Burra, N., Barras, C., Coll, S. Y., & Kerzel, D. (2016). Electrophysiological evidence for attentional capture by irrelevant angry facial expressions. Biological Psychology, 120, 69–80.

  • Coelho, C. M., Cloete, S., & Wallis, G. (2010). The face-in-the-crowd effect: When angry faces are just cross(es). Journal of Vision, 10(1), 7.

  • Coll, S. Y., & Grandjean, D. (2016). Visuomotor integration of relevant and irrelevant angry and fearful facial expressions. Acta Psychologica, 170, 226–238.

  • Colzato, L. S., van Wouwe, N. C., & Hommel, B. (2007). Feature binding and affect: Emotional modulation of visuo-motor integration. Neuropsychologia, 45(2), 440–446.

  • Costanzo, F. S., Markel, N. N., & Costanzo, P. R. (1969). Voice quality profile and perceived emotion. Journal of Counseling Psychology, 16(3), 267–270.

  • Eder, A. B., & Klauer, K. C. (2007). Common valence coding in action and evaluation: Affective blindness towards response-compatible stimuli. Cognition and Emotion, 21(6), 1297–1322.

  • Eder, A. B., & Klauer, K. C. (2009). A common-coding account of the bidirectional evaluation–behavior link. Journal of Experimental Psychology General, 138(2), 218–235.

  • Eder, A. B., Müsseler, J., & Hommel, B. (2012). The structure of affective action representations: Temporal binding of affective response codes. Psychological Research Psychologische Forschung, 76(1), 111–118.

  • Fox, E., & Damjanovic, L. (2006). The eyes are sufficient to produce a threat superiority effect. Emotion, 6(3), 534–539.

  • Frijda, N. (1986). The emotions. Cambridge: Cambridge University Press.

  • Frijda, N. (2007). The laws of emotion. Mahwah, NJ: Erlbaum.

  • Frings, C., Rothermund, K., & Wentura, D. (2007). Distractor repetitions retrieve previous responses to targets. The Quarterly Journal of Experimental Psychology, 60(10), 1367–1377.

  • Frühholz, S., Klaas, H. S., Patel, S., & Grandjean, D. (2014). Talking in fury: The cortico-subcortical network underlying angry vocalizations. Cerebral Cortex, 25(9), 2752–2762.

  • Giesen, C., & Rothermund, K. (2011). Affective matching moderates S-R binding. Cognition & Emotion, 25(2), 342–350.

  • Grandjean, D., Sander, D., & Scherer, K. R. (2008). Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization. Consciousness and Cognition, 17(2), 484–495.

  • Green, P., & Macleod, C. J. (2016). SIMR: An R package for power analysis of generalised linear mixed models by simulation. Methods in Ecology and Evolution, 7(4), 493–498.

  • Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality and Social Psychology, 54(6), 917–924.

  • Hommel, B. (1995). Stimulus–response compatibility and the Simon effect: Toward an empirical clarification. Journal of Experimental Psychology: Human Perception and Performance, 21(4), 764–775.

  • Hommel, B. (1998). Event files: Evidence for automatic integration of stimulus–response episodes. Visual Cognition, 5(1–2), 183–216.

  • Hommel, B. (2004). Event files: Feature binding in and across perception and action. Trends in Cognitive Sciences, 8(11), 494–500.

  • Hommel, B., Memelink, J., Zmigrod, S., & Colzato, L. S. (2014). Attentional control of the creation and retrieval of stimulus–response bindings. Psychological Research, 78(4), 520–538.

  • Huttar, G. L. (1968). Relations between prosodic variables and emotions in normal American English utterances. Journal of Speech, Language and Hearing Research, 11(3), 481–487.

  • Kahneman, D., Treisman, A., & Gibbs, B. J. (1992). The reviewing of object files: Object-specific integration of information. Cognitive Psychology, 24(2), 175–219.

  • Kensinger, E. A., & Schacter, D. L. (2006). When the Red Sox shocked the Yankees: Comparing negative and positive memories. Psychonomic Bulletin and Review, 13(5), 757–763.

  • Lang, P., & Bradley, M. (2010). Emotion and the motivational brain. Biological Psychology, 84(3), 437–450.

  • Lavender, T., & Hommel, B. (2007). Affect and action: Towards an event-coding account. Cognition and Emotion, 21(6), 1270–1296.

  • Ledoux, J. E. (1994). Emotion, memory and the brain. Scientific American, 270(6), 50–57.

  • McCulloch, C. E. (2003). Generalized linear mixed models. NSF-CBMS Regional Conference Series in Probability and Statistics 7. Beachwood: Institute of Mathematical Statistics.

  • Memelink, J., & Hommel, B. (2013). Intentional weighting: A basic principle in cognitive control. Psychological Research Psychologische Forschung, 77(3), 249–259.

  • Moeller, B., & Frings, C. (2014). Attention meets binding: Only attended distractors are used for the retrieval of event files. Attention, Perception and Psychophysics, 76(4), 959–978.

  • Moeller, B., Frings, C., & Pfister, R. (2016). The structure of distractor–response bindings: Conditions for configural and elemental integration. Journal of Experimental Psychology: Human Perception and Performance, 42(4), 464–479.

  • Morey, R. D., & Rouder, J. N. (2011). Bayes factor approaches for testing interval null hypotheses. Psychological Methods, 16(4), 406–419.

  • Nakagawa, S., & Schielzeth, H. (2013). A general and simple method for obtaining R² from generalized linear mixed-effects models. Methods in Ecology and Evolution, 4(2), 133–142.

  • Niedenthal, P. (2007). Embodying emotion. Science, 316(5827), 1002–1005.

  • Öhman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality and Social Psychology, 80(3), 381–396.

  • Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh Inventory. Neuropsychologia, 9(1), 97–113.

  • Phaf, R. H., Mohr, S. E., Rotteveel, M., & Wicherts, J. M. (2014). Approach, avoidance, and affect: A meta-analysis of approach-avoidance tendencies in manual reaction time tasks. Frontiers in Psychology, 5, 378.

  • Pinkham, A. E., Griffin, M., Baron, R., Sasson, N. J., & Gur, R. C. (2010). The face in the crowd effect: Anger superiority when using real faces and multiple identities. Emotion, 10(1), 141–146.

  • Scherer, K. R. (2001). Appraisal considered as a process of multilevel sequential checking. In K. R. Scherer, A. Schorr & T. Johnstone (Eds.), Appraisal processes in emotion: Theory, methods, research (pp. 92–120). New York: Oxford University Press.

  • Simon, J. R., & Rudell, A. P. (1967). Auditory S–R compatibility: The effect of an irrelevant cue on information processing. Journal of Applied Psychology, 51(3), 300–304.

  • Talarico, J. M., & Rubin, D. C. (2003). Confidence, not consistency, characterizes flashbulb memories. Psychological Science, 14(5), 455–461.

  • Treisman, A. (1996). The binding problem. Current Opinion in Neurobiology, 6(2), 171–178.

  • Waszak, F., & Pholulamdeth, V. (2009). Episodic S–R bindings and emotion: About the influence of positive and negative action effects on stimulus–response associations. Experimental Brain Research, 194(3), 489–494.

  • Weymar, M., Löw, A., Öhman, A., & Hamm, A. O. (2011). The face is more than its parts: Brain dynamics of enhanced spatial attention to schematic threat. NeuroImage, 58(3), 946–954.

  • Zmigrod, S., & Hommel, B. (2009). Auditory event files: Integrating auditory perception and action planning. Attention, Perception and Psychophysics, 71(2), 352–362.

  • Zmigrod, S., & Hommel, B. (2010). Temporal dynamics of unimodal and multimodal feature binding. Attention, Perception and Psychophysics, 72(1), 142–152.

  • Zmigrod, S., Spapé, M., & Hommel, B. (2009). Intermodal event files: Integrating features across vision, audition, taction, and action. Psychological Research Psychologische Forschung, 73(5), 674–684.

Author information

Correspondence to Sélim Yahia Coll.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the Ethical Standards of the Institutional Research Committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 1209 KB)

Cite this article

Coll, S.Y., Frühholz, S. & Grandjean, D. Audiomotor integration of angry and happy prosodies. Psychological Research 83, 1640–1655 (2019). https://doi.org/10.1007/s00426-018-1020-9
