Animal Cognition, Volume 7, Issue 3, pp 171–178

Original Article

Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition

Lisa A. Parr


The ability of organisms to discriminate social signals, such as affective displays, using different sensory modalities is important for social communication. However, a major problem for understanding the evolution and integration of multimodal signals is determining how humans and animals attend to different sensory modalities, and how these different modalities contribute to the perception and categorization of social signals. Using a matching-to-sample procedure, chimpanzees discriminated videos of conspecifics’ facial expressions that contained only auditory or only visual cues by selecting one of two facial expression photographs that matched the expression category represented by the sample. Other videos were edited to contain incongruent sensory cues, i.e., the visual features of one expression but the auditory features of another. In these cases, subjects were free to select the expression that matched either the auditory or the visual modality, whichever was more salient for that expression type. Results showed that chimpanzees were able to discriminate facial expressions using only auditory or only visual cues, and when these modalities were mixed. In the incongruent trials, however, clear preferences for either the visual or the auditory modality emerged depending on the expression category. Pant-hoots and play faces were discriminated preferentially using the auditory modality, while screams were discriminated preferentially using the visual modality. Therefore, depending on the type of expressive display, the auditory and visual modalities were differentially salient in ways that appear consistent with the ethological importance of that display’s social function.


Keywords: Communication · Multimodal signals · Facial expressions · Chimpanzee



Acknowledgements

This investigation was supported by RR-00165 from the NIH/NCRR to the Yerkes Regional Primate Research Center. The Yerkes Primate Center is fully accredited by the American Association for Accreditation of Laboratory Animal Care. Thanks to the Living Links Center, Emory University, for the use of photographic material, and to the animal care staff at the Yerkes Primate Center. Todd Preuss, Stuart Zola, and three anonymous reviewers provided helpful comments on earlier versions of this manuscript.

Supplementary material

Video S1 An example of a congruent multimodal trial. The sample shows a video of a pant-hoot expression on the left, and the right panel shows the two comparison images, a pant-hoot expression and a neutral portrait. The correct response is to select the pant-hoot expression on the left side.


JPG (30 KB)

AVI (2.6 MB)

Sound Clip S2 An example of a cross-modal trial. The sample is a scream vocalization. No visual image is present. The right panel shows the two comparison images, a scream expression and a neutral portrait. The correct response is to select the scream expression on the right side.


JPG (19 KB)

WAV (512 KB)

Video S3 An example of an intra-modal trial. The sample shows a video of a scream expression with no audio presented. The right panel shows the two comparison images, a scream expression and a neutral portrait. The correct response is to select the scream expression on the left side.


JPG (24 KB)

AVI (1.0 MB)

Video S4 An example of an incongruent multimodal trial. The sample shows a video of a pant-hoot expression on the left. It is played with a vocalization incongruent with the sample video, in this case a scream vocalization. The right panel shows the two comparison images, a pant-hoot expression and a scream portrait. Either choice is correct: the pant-hoot matches the visual modality of the sample, and the scream matches the auditory modality. Subjects are nondifferentially rewarded for choosing either comparison.


JPG (26 KB)

AVI (3.24 MB)



Copyright information

© Springer-Verlag 2004

Authors and Affiliations

Division of Psychobiology, Yerkes National Primate Research Center, Emory University, Atlanta, USA
