Behavior Research Methods, Volume 47, Issue 1, pp 228–234

Generating an item pool for translational social cognition research: Methodology and initial validation

  • Michael K. Keutmann
  • Samantha L. Moore
  • Adam Savitt
  • Ruben C. Gur


Existing sets of social and emotional stimuli suitable for social cognition research are limited in many ways, including size, unimodal stimulus delivery, and restriction to the major universal emotions. Existing measures of social cognition could be improved by taking advantage of item response theory and adaptive testing technology to develop instruments that obtain more efficient measures of multimodal social cognition. However, for this to be possible, large pools of emotional stimuli must be obtained and validated. We present the development of a large, high-quality multimedia stimulus set produced by professional adult and child actors (ages 5 to 74) containing both visual and vocal emotional expressions. We obtained over 74,000 audiovisual recordings of a wide array of emotional and social behaviors, including the main universal emotions (happiness, sadness, anger, fear, disgust, and surprise), as well as more complex social expressions (pride, affection, sarcasm, jealousy, and shame). The actors generated a large quantity of technically superior, ecologically valid stimuli that were digitized, archived, and rated for accuracy and intensity of expression. A subset of these facial and vocal expressions of emotion and social behavior was submitted for quantitative ratings to generate parameters for validity and discriminability. These stimuli are suitable for affective neuroscience-based psychometric tests, functional neuroimaging, and social cognitive rehabilitation programs. The purposes of this report are to describe the method of obtaining and validating this database and to make it accessible to the scientific community. We invite all those interested in participating in the use and validation of these stimuli to access them at
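The adaptive-testing idea the abstract alludes to can be sketched in miniature. Under a two-parameter logistic (2PL) IRT model, each calibrated item has a discrimination and a difficulty parameter, and an adaptive test repeatedly administers the unused item carrying the most Fisher information at the examinee's current ability estimate. The snippet below is an illustrative sketch only; the item parameters are hypothetical placeholders, not values from the database described here.

```python
import math

def p_correct(theta, a, b):
    """2PL IRT: probability of a correct response at ability theta,
    given item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta:
    a^2 * p * (1 - p)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, pool):
    """Adaptive selection: return the (a, b) pair in the pool that is
    most informative at the current ability estimate."""
    return max(pool, key=lambda item: item_information(theta, *item))

# Hypothetical item parameters (a, b), standing in for values that
# would be estimated from accuracy/intensity ratings of the stimuli.
pool = [(1.2, -0.5), (0.8, 0.0), (1.5, 0.4), (1.0, 1.2)]
print(next_item(0.5, pool))  # → (1.5, 0.4)
```

The item with difficulty nearest the current ability estimate (weighted by its squared discrimination) wins, which is why a large, well-calibrated pool matters: the selector needs informative items available across the whole ability range.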


Keywords: Social cognition · Emotion · Affect · Stimuli · Faces · Audiovisual recordings


Author note

This work was supported by Grant Nos. NIH R01-MH060722 and NIH R01-MH084856. We thank our directors, Amy Dugas Brown and David M. O’Connor, and the Brain Behavior Laboratory staff, who edited the stimuli and provided Web programming support. The authors also recognize the work of Raymond P. Hill, who contributed significantly to the manuscript. Ray passed away in January 2013.

Supplementary material

ESM 1 (PDF, 64 kb): 13428_2014_464_MOESM1_ESM.pdf



Copyright information

© Psychonomic Society, Inc. 2014

Authors and Affiliations

  • Michael K. Keutmann (1, 4)
  • Samantha L. Moore (2)
  • Adam Savitt (3)
  • Ruben C. Gur (3)

  1. University of Illinois, Chicago, USA
  2. Temple University, Philadelphia, USA
  3. University of Pennsylvania, Philadelphia, USA
  4. Department of Psychology, University of Illinois at Chicago, Chicago, USA
