
Behavior Research Methods, Volume 40, Issue 2, pp 531–539

The Montreal Affective Voices: A validated set of nonverbal affect bursts for research on auditory affective processing

  • Pascal Belin
  • Sarah Fillion-Bilodeau
  • Frédéric Gosselin

Abstract

The Montreal Affective Voices consist of 90 nonverbal affect bursts corresponding to the emotions of anger, disgust, fear, pain, sadness, surprise, happiness, and pleasure (plus a neutral expression), recorded by 10 different actors (5 male and 5 female). Ratings of valence, arousal, and intensity for the eight emotions were collected for each vocalization from 30 participants. Analyses revealed high recognition accuracies for most of the emotional categories (a mean of 68%). They also revealed significant effects of both the actors' and the participants' gender: The highest hit rates (75%) were obtained for female participants rating female vocalizations, and the lowest hit rates (60%) for male participants rating male vocalizations. Interestingly, the mixed situations, that is, male participants rating female vocalizations or female participants rating male vocalizations, yielded similar, intermediate hit rates. The Montreal Affective Voices are available for download at vnl.psy.gla.ac.uk/ (Resources section).
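The gender effects reported above rest on simple hit-rate arithmetic: a trial counts as a hit when the emotion a listener selects matches the actor's intended emotion, and rates are aggregated per participant-gender × actor-gender cell. A minimal sketch of that tabulation in Python, using invented trial records (none of these values come from the actual dataset):

```python
from collections import defaultdict

# Hypothetical trial records: (participant_gender, actor_gender,
# intended_emotion, chosen_emotion). Illustrative values only,
# not the actual Montreal Affective Voices ratings.
trials = [
    ("F", "F", "anger", "anger"),
    ("F", "F", "fear", "fear"),
    ("F", "F", "sadness", "disgust"),
    ("M", "M", "anger", "anger"),
    ("M", "M", "fear", "sadness"),
    ("M", "F", "disgust", "disgust"),
    ("F", "M", "happiness", "happiness"),
    ("F", "M", "pain", "fear"),
]

def hit_rates(trials):
    """Proportion of correct identifications per (participant, actor) gender cell."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for pg, ag, intended, chosen in trials:
        totals[(pg, ag)] += 1
        hits[(pg, ag)] += intended == chosen
    return {cell: hits[cell] / totals[cell] for cell in totals}

rates = hit_rates(trials)
```

With real data, comparing `rates[("F", "F")]` against `rates[("M", "M")]` and the two mixed cells would reproduce the kind of gender-by-gender breakdown the abstract describes.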

Keywords

Emotion recognition, Facial Action Coding System, vocal expression, emotional prosody, high recognition accuracy

Copyright information

© Psychonomic Society, Inc. 2008

Authors and Affiliations

  • Pascal Belin (1, 2)
  • Sarah Fillion-Bilodeau (2)
  • Frédéric Gosselin (2)
  1. Department of Psychology, University of Glasgow, Glasgow, Scotland
  2. Université de Montréal, Montréal, Canada
