Existing sets of social and emotional stimuli suitable for social cognition research are limited in many ways, including size, unimodal stimulus delivery, and restriction to major universal emotions. Existing measures of social cognition could be improved by taking advantage of item response theory and adaptive testing technology to develop instruments that obtain more efficient measures of multimodal social cognition. However, for this to be possible, large pools of emotional stimuli must be obtained and validated. We present the development of a large, high-quality multimedia stimulus set produced by professional adult and child actors (ages 5 to 74) containing both visual and vocal emotional expressions. We obtained over 74,000 audiovisual recordings of a wide array of emotional and social behaviors, including the main universal emotions (happiness, sadness, anger, fear, disgust, and surprise), as well as more complex social expressions (pride, affection, sarcasm, jealousy, and shame). The actors generated a high quantity of technically superior, ecologically valid stimuli that were digitized, archived, and rated for accuracy and intensity of expressions. A subset of these facial and vocal expressions of emotion and social behavior were submitted for quantitative ratings to generate parameters for validity and discriminability. These stimuli are suitable for affective neuroscience-based psychometric tests, functional neuroimaging, and social cognitive rehabilitation programs. The purposes of this report are to describe the method of obtaining and validating this database and to make it accessible to the scientific community. We invite all those interested in participating in the use and validation of these stimuli to access them at www.med.upenn.edu/bbl/actors/index.shtml.
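The abstract's premise is that item response theory (IRT) plus adaptive item selection can make multimodal social cognition testing more efficient. As a minimal sketch of that idea, the snippet below implements the standard two-parameter logistic (2PL) IRT model and the common maximum-information adaptive rule: given a current ability estimate, administer the item whose Fisher information at that ability is highest. The item names and parameter values are hypothetical illustrations, not items or calibrations from the database described in the paper.

```python
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta,
    given item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta:
    I(theta) = a^2 * P * (1 - P)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items):
    """Adaptive rule: pick the remaining item that is most
    informative at the current ability estimate."""
    return max(items, key=lambda it: item_information(theta, it["a"], it["b"]))

# Hypothetical emotion-recognition items (names and parameters invented).
items = [
    {"name": "happy_face_easy",    "a": 1.2, "b": -1.5},
    {"name": "sarcasm_voice_hard", "a": 1.8, "b":  1.0},
    {"name": "fear_face_medium",   "a": 1.5, "b":  0.0},
]

# At an average ability estimate (theta = 0), the medium-difficulty
# item is maximally informative.
print(next_item(0.0, items)["name"])  # fear_face_medium
```

In a full adaptive test, the ability estimate would be updated (e.g., by maximum likelihood) after each response and the selection rule reapplied, which is why calibrating a large validated item pool, as this paper describes, is a prerequisite.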
This work was supported by NIH Grants R01-MH060722 and R01-MH084856. We thank our directors, Amy Dugas Brown and David M. O'Connor, and the Brain Behavior Laboratory staff, who edited the stimuli and provided Web programming support. The authors also recognize the work of Raymond P. Hill, who contributed significantly to the manuscript. Ray passed away in January 2013.
Keutmann, M.K., Moore, S.L., Savitt, A. et al. Generating an item pool for translational social cognition research: Methodology and initial validation. Behav Res 47, 228–234 (2015). https://doi.org/10.3758/s13428-014-0464-0
Keywords: Social cognition