Journal of Nonverbal Behavior, Volume 29, Issue 4, pp 193–215

Prosody–face Interactions in Emotional Processing as Revealed by the Facial Affect Decision Task


DOI: 10.1007/s10919-005-7720-z

Cite this article as:
Pell, M. D. J Nonverbal Behav (2005) 29: 193.


Previous research employing the facial affect decision task (FADT) indicates that when listeners hear semantically anomalous utterances produced in different emotional tones (prosody), the emotional meaning of the prosody primes decisions about an emotionally congruent rather than incongruent facial expression (Pell, M. D., Journal of Nonverbal Behavior, 29, 45–73). The present study extended the FADT by investigating the approximate timecourse of prosody–face interactions in nonverbal emotion processing. Participants made facial affect decisions about happy and sad face targets after listening to utterance fragments produced in an emotionally related, unrelated, or neutral prosody, cut to 300, 600, or 1000 ms in duration. Results indicated that prosodic information lasting at least 600 ms was necessary to activate the shared emotion knowledge presumed to underlie prosody–face congruity effects.


Keywords: emotion processing · vocal expression · facial expression · speech communication · affective priming · social cognition

Copyright information

© Springer Science+Business Media, Inc. 2005

Authors and Affiliations

  1. School of Communication Sciences and Disorders, McGill University, Montréal, Canada
