
Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition

  • Original Article
  • Published in Animal Cognition

Abstract

The ability of organisms to discriminate social signals, such as affective displays, using different sensory modalities is important for social communication. However, a major problem for understanding the evolution and integration of multimodal signals is determining how humans and animals attend to different sensory modalities, and how these different modalities contribute to the perception and categorization of social signals. Using a matching-to-sample procedure, chimpanzees discriminated videos of conspecifics’ facial expressions that contained only auditory or only visual cues by selecting one of two facial expression photographs that matched the expression category represented by the sample. Other videos were edited to contain incongruent sensory cues, i.e., visual features of one expression but auditory features of another. In these cases, subjects were free to select the expression that matched either the auditory or the visual modality, whichever was more salient for that expression type. Results showed that chimpanzees were able to discriminate facial expressions using only auditory or only visual cues, and when these modalities were mixed. However, in these latter trials, clear preferences for either the visual or the auditory modality emerged, depending on the expression category. Pant-hoots and play faces were discriminated preferentially using the auditory modality, while screams were discriminated preferentially using the visual modality. Therefore, depending on the type of expressive display, the auditory and visual modalities were differentially salient in ways that appear consistent with the ethological importance of that display’s social function.

Fig. 1
Fig. 2a


Notes

  1. Note that the play faces used in this study were always accompanied by laughter vocalizations, although this is not always the case in naturally occurring behavior.


Acknowledgements

This investigation was supported by RR-00165 from the NIH/NCRR to the Yerkes Regional Primate Research Center. The Yerkes Primate Center is fully accredited by the American Association for Accreditation of Laboratory Animal Care. Thanks to the Living Links Center, Emory University, for the use of photographic material, and the animal care staff at the Yerkes Primate Center. Todd Preuss, Stuart Zola, and three anonymous reviewers provided helpful comments on earlier versions of this manuscript.

Author information


Corresponding author

Correspondence to Lisa A. Parr.

Electronic Supplementary Material

Video S1 An example of a congruent multi-modal trial. The sample shows a video of a pant-hoot expression on the left, and the right panel shows the two comparison images, a pant-hoot expression and a neutral portrait. The correct response is to select the pant-hoot expression on the left side.

AVI (2.6 MB)

JPG (30 KB)


sampleS2.wav

WAV (512 KB)

JPG (19 KB)


Video S3 An example of an intra-modal trial. The sample shows a video of a scream expression with no audio presented. The right panel shows the two comparison images, a scream expression and a neutral portrait. The correct response is to select the scream expression on the left side.

AVI (1.0 MB)

JPG (24 KB)


Video S4 An example of an incongruent multi-modal trial. The sample shows a video of a pant-hoot expression on the left, played with a vocalization incongruent with the sample video, in this case a scream vocalization. The right panel shows the two comparison images, a pant-hoot expression and a scream portrait. Either expression type is correct: the pant-hoot matches the visual modality of the sample, and the scream expression matches the auditory modality. Subjects were rewarded nondifferentially for choosing either comparison.

AVI (3.24 MB)

JPG (26 KB)



About this article

Cite this article

Parr, L.A. Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition. Anim Cogn 7, 171–178 (2004). https://doi.org/10.1007/s10071-004-0207-1
