Perceptual biases for multimodal cues in chimpanzee (Pan troglodytes) affect recognition
The ability of organisms to discriminate social signals, such as affective displays, using different sensory modalities is important for social communication. However, a major problem for understanding the evolution and integration of multimodal signals is determining how humans and animals attend to different sensory modalities, and how these different modalities contribute to the perception and categorization of social signals. Using a matching-to-sample procedure, chimpanzees discriminated videos of conspecifics’ facial expressions that contained only auditory or only visual cues by selecting one of two facial expression photographs that matched the expression category represented by the sample. Other videos were edited to contain incongruent sensory cues, i.e., visual features of one expression but auditory features of another. In these cases, subjects were free to select the expression that matched either the auditory or the visual modality, whichever was more salient for that expression type. Results showed that chimpanzees were able to discriminate facial expressions using only auditory or visual cues, and when these modalities were mixed. However, in these latter trials, clear preferences for either the visual or the auditory modality emerged, depending on the expression category. Pant-hoots and play faces were discriminated preferentially using the auditory modality, while screams were discriminated preferentially using the visual modality. Therefore, depending on the type of expressive display, the auditory and visual modalities were differentially salient in ways that appear consistent with the ethological importance of that display’s social function.
Keywords: Communication · Multimodal signals · Facial expressions · Chimpanzee
This investigation was supported by RR-00165 from the NIH/NCRR to the Yerkes Regional Primate Research Center. The Yerkes Primate Center is fully accredited by the American Association for Accreditation of Laboratory Animal Care. Thanks to the Living Links Center, Emory University, for the use of photographic material, and the animal care staff at the Yerkes Primate Center. Todd Preuss, Stuart Zola, and three anonymous reviewers provided helpful comments on earlier versions of this manuscript.
Video S1 An example of a congruent multi-modal trial. The sample shows a video of a pant-hoot expression on the left, and the right panel shows the two comparison images, a pant-hoot expression and a neutral portrait. The correct response is to select the pant-hoot expression on the left side.
Sound Clip S2 An example of a cross-modal trial. The sample is a scream vocalization. No visual image is present. The right panel shows the two comparison images, a scream expression and a neutral portrait. The correct response is to select the scream expression on the right side.
Video S3 An example of an intra-modal trial. The sample shows a video of a scream expression with no audio presented. The right panel shows the two comparison images, a scream expression and a neutral portrait. The correct response is to select the scream expression on the left side.
Video S4 An example of an incongruent multi-modal trial. The sample shows a video of a pant-hoot expression on the left. This is played with a vocalization incongruent with the sample video, in this case a scream vocalization. The right panel shows the two comparison images, a pant-hoot expression and a scream portrait. Either expression type is correct: the pant-hoot matches the visual modality of the sample, and the scream expression matches the auditory modality. Subjects are nondifferentially rewarded for choosing either comparison.