Apparent motion enhances visual rhythm discrimination in infancy
Many studies have demonstrated that infants exhibit robust auditory rhythm discrimination, but research on infants’ perception of visual rhythm is limited. In particular, the role of motion in infants’ perception of visual rhythm remains unknown, despite the prevalence of motion cues in naturally occurring visual rhythms. In the present study, we examined the role of motion in 7-month-old infants’ discrimination of visual rhythms by comparing a condition in which the rhythmic stimuli contained apparent motion with a condition in which the stimuli were stationary. Infants succeeded at discriminating visual rhythms only when the visual rhythm occurred with an apparent motion component. These results support the view that motion plays a role in infants’ perception of visual temporal information, consistent with the manner in which natural rhythms appear in the visual world.
Keywords: Temporal processing · Apparent motion · Visual rhythm · Infants
Rhythm is typically thought of as the temporal organization in music—more specifically, the patterning of strong and weak beats in a musical piece. However, rhythm can be found in many domains outside of music, anywhere there is organization in the temporal dimension. The universality of rhythm—the pattern of events in time—stems from the fact that rhythm can be perceived in many different sensory modalities: audition, vision, touch, and proprioception. For example, the events of an auditory rhythm might consist of notes, chords, or syllables, while the events of a visual rhythm might consist of blinking lights or patterns of limb movements.
Despite the ubiquity of rhythm across the senses, most research on rhythm perception and its development has focused on audition. Infants can perceive and discriminate auditory rhythms very early in life. For example, 4-day-old infants use prosody to discriminate low-pass filtered versions of their native language from a foreign language, suggesting that auditory rhythm is perceived even before birth (Mehler et al., 1988). By 2–3 months of age, infants discriminate changes in auditory rhythms on the basis of differences in the interonset intervals (IOIs) that the rhythms comprise, as well as changes in the order of the same IOIs (Demany, McKenzie, & Vurpillot, 1977). These studies, and many others, have shown that infants are tuned to auditory rhythms very early in life (e.g., Winkler, Háden, Ladinig, Sziller, & Honing, 2009) and can discriminate auditory rhythms on the basis of fine details, and not just gross differences (for a recent review, see Trehub & Hannon, 2006).
To what extent can infants process and utilize rhythms in other modalities? One clue comes from research on multimodal perception. For example, Phillips-Silver and Trainor (2005) demonstrated how rhythm in one modality influences rhythm perception in another modality. Seven-month-old infants were briefly exposed to an ambiguous, nonaccented auditory rhythm while being bounced by a caregiver. Infants subsequently listened longer to the auditory rhythm, with accents matching the pattern in which they were bounced. The ambiguity of the auditory rhythm was resolved by rhythm experienced via proprioception.
Can infants likewise exploit rhythm in the visual modality? The visual world contains a great deal of information patterned in time that infants could utilize, in principle, to make predictions about future events. But do infants attend to visually conveyed rhythmic information? Additional experiments by Phillips-Silver and Trainor (2005) demonstrated that when the infants were blindfolded during the bouncing period of the experiment, they continued to prefer the auditory rhythm that matched the bouncing rhythm. However, infants who simply watched the experimenter bounce but were not themselves bounced failed to show an auditory preference. These infants did not use visual rhythmic information to interpret ambiguous auditory rhythms.
Nevertheless, other studies have indicated that infants do utilize visual rhythmic information observed in real-world bimodal events. Studies by Bahrick and Lickliter (2000, 2004), using the real-world bimodal event of a hammer hitting a block, demonstrated that infants do process visual rhythm, especially when correlated auditory cues are available during the learning phase. When audiovisual rhythmic events were presented to 5- and 8-month-olds, the infants successfully detected changes in the visual rhythms in a unimodal test. However, 8-month-olds, but not 5-month-olds, were able to perform the unimodal visual rhythm discrimination task without the presence of correlated auditory cues during habituation (Bahrick & Lickliter, 2000, 2004). These results suggest a developmental progression in infants’ ability to process visual rhythms over the first year of postnatal life.
Previous research on infants’ development of visual rhythm processing has thus largely focused on multimodal processing, especially the role played by auditory rhythm in directing infants’ attention to correlated rhythmic patterns in the visual modality. This research suggests that younger infants need the redundant rhythmic information across modalities in order to initially attend to, or recognize, a visual rhythmic pattern (e.g., Bahrick & Lickliter, 2000; Gibson, 1969). However, there is another important feature of visual rhythm that may affect infants’ ability to perceive and learn about visual rhythmic information. In the natural environment, visual rhythm is almost always accompanied by motion cues. Indeed, unlike auditory rhythm, which typically arises from a stationary source, we are accustomed to experiencing visual rhythm in the context of motion—gait, dance, repetitive actions such as tool use, and so forth.
Research on adult visual rhythm processing suggests that motion plays an important role in the perception of visual rhythms. For example, adults perform more poorly in tapping synchronization studies when synchronizing with visual rhythms (created with stationary flashing lights) than when synchronizing with auditory rhythms (e.g., Kolers & Brewster, 1985; Patel, Iversen, Chen, & Repp, 2005; Repp & Penel, 2002, 2004). However, Iversen, Patel, Nicodemus, and Emmorey (2009) demonstrated a marked improvement in adults’ tapping synchronization when the visual rhythm was instantiated as a continuously moving ball, rhythmically bouncing up and down. Additionally, Hove, Spivey, and Krumhansl (2010) found an increase in accuracy for adults’ tapping synchrony using visual stimuli with apparent motion in the same direction as the tapping motion.
The presence of motion cues thus facilitates adult visual rhythmic processing, whether because of experience with visual motion in the real world or because of inherent visual processing advantages for moving stimuli (e.g., moving stimuli capture attention more readily). Do motion cues similarly facilitate infants’ perception of visual rhythms? To test this hypothesis, we familiarized 7-month-old infants with a silent visual rhythmic pattern and manipulated the presence/absence of motion cues. We used apparent motion in this study to control stimulus properties across the two conditions. We hypothesized that, like adults, infants would be more successful at visual rhythmic discrimination when the display included apparent motion cues. However, if motion does not facilitate visual rhythmic processing in infancy, there should be no difference in infants’ test performance in the motion versus stationary rhythm conditions.
Thirty-two 7-month-old infants were assigned to either the stationary (M = 7.5 months, SD = 0.31, range = 7.1–7.9) or motion (M = 7.7 months, SD = 0.28, range = 7.1–8.0) visual rhythm condition. An additional 10 infants were tested but were excluded from the analyses for the following reasons: parental interference (1), crying (1), exceeding maximum looking-time criteria (3), failure to attend (2), and equipment error (3). All participants were full term and had no history of hearing or vision problems or current ear infections, according to parental report.
The visual rhythms were constructed from brightly colored shapes. In the motion condition, the shapes flashed from left to right across six locations, creating apparent motion across the screen. The left edge of the shape in the first location was 1 cm from the left screen edge; the shape locations were evenly spaced, with a distance of 5.8 cm from the center of one shape to the center of the next shape. The right edge of the shape in the sixth location was 0.7 cm from the right edge of the screen. All shapes were centered vertically, with 12 cm of white screen above and below. In the stationary condition, the shapes flashed in a single central location at the same vertical location as the shapes in the motion condition but horizontally in the exact center of the screen. A blank frame was used to start and end each IOI event in both conditions. The shape then appeared full size and stayed illuminated on the screen for its prescribed duration. The blank frames gave the appearance of a single shape flashing across the screen in the specified rhythm for the motion condition, or flashing with the specified rhythm in the center of the screen in the stationary condition (see Supplementary Materials). Piloting with adults showed no reports of perception of afterimages in either condition. During familiarization, three shapes (circle, pentagon, and triangle) were presented in four different colors (blue, green, red, and yellow). The largest dimension (e.g., height, width, or diameter) of each shape was 5.72 cm; thus, each shape subtended a visual angle of approximately 4.03°. The variations in the shapes and colors were randomized across repetitions of the rhythm. Within one repetition of the rhythm, however, the color and shape remained the same to encourage the perception of a single shape flashing rhythmically.
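As a sanity check on the reported geometry: the visual angle follows from the shape size and the viewing distance. The viewing distance is not reported, so the short sketch below (all constants taken from the text) recovers the distance implied by the reported 4.03°; the inferred distance is a derived assumption, not a value from the study.

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

SHAPE_CM = 5.72      # largest dimension of each shape (from the text)
REPORTED_DEG = 4.03  # reported visual angle (from the text)

# Viewing distance implied by the reported values (not stated in the text):
implied_distance = SHAPE_CM / (2 * math.tan(math.radians(REPORTED_DEG / 2)))
print(round(implied_distance, 1))  # roughly 81 cm
```

The implied viewing distance of roughly 80 cm is typical for infant looking-time setups, which is consistent with the reported angle.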
For the test trials, the rhythmic displays were identical to those in the familiarization period, except that all test rhythms were constructed using purple squares. In the motion condition, the purple squares flashed left to right across the six locations on the screen, while in the stationary condition, the squares flashed in the center of the screen. Sequences that were familiar for half of the infants in each condition were novel for the other half, and vice versa.
Infants were seated on their parents’ laps in a sound-attenuated booth, facing a video monitor. Parents listened to unrelated masking music over closed-ear headphones and were instructed not to speak to, bounce, or pat their infants during the study. An attention-getter (a silent movie of a pinwheel spinning at a constant rate) was used to attract infants’ attention back to the screen at the start and end of the familiarization period and between test trials.
Infants were tested using a preferential-looking paradigm with an initial fixed familiarization period. The visual materials were presented on a monitor directly in front of the infant. The experiment began with a familiarization period (48 s) consisting of 15 repetitions of the rhythm. An 800-ms blank screen (equal to the longest IOI in the rhythm) separated successive repetitions. An infant-controlled training block, consisting of two additional trials of the familiarized rhythm, followed familiarization. The training block was included to teach the infants that when they looked away from the central screen, the stimuli would stop and change to the spinning pinwheel, unlike the familiarization period, which was not infant controlled. The test followed immediately and consisted of two blocks of infant-controlled test trials. Each block included one familiar trial and one novel trial (the other rhythm in the set). The order of presentation of familiar and novel trials was counterbalanced across participants and was the same in the two blocks.
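The familiarization timing pins down the length of one rhythm cycle. The text does not specify whether a blank also followed the final repetition, so the sketch below works out the cycle duration under both readings; the placement of the final blank is an assumption, not a detail from the study.

```python
# Familiarization timing implied by the text.
TOTAL_S = 48.0  # total familiarization period
REPS = 15       # repetitions of the rhythm
GAP_S = 0.8     # blank screen between repetitions

# If a blank follows every repetition (15 gaps):
cycle_with_trailing_gap = (TOTAL_S - REPS * GAP_S) / REPS        # 2.4 s
# If blanks appear only between repetitions (14 gaps):
cycle_between_only = (TOTAL_S - (REPS - 1) * GAP_S) / REPS       # about 2.45 s
print(round(cycle_with_trailing_gap, 2), round(cycle_between_only, 2))
```

The round 2.4-s value under the first reading suggests each rhythm repetition plus its trailing blank occupied 3.2 s, but either reading is consistent with the reported 48-s total.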
Looking times were recorded online by a trained coder observing the infant from outside the booth on a closed-circuit video monitor. The coder was blind to the stimuli presented on each trial. The software Habit X (Cohen, Atkinson, & Chaput, 2004) was used to present the stimuli and record the looking times. During the fixed familiarization, the infants’ looking behavior did not affect the presentation of the stimuli. During the test, the infants’ behavior controlled the length of the trial. If the infant looked away from the central monitor for more than 1 s, the test trial ended. Alternatively, if the infant did not look away at all, the trial continued for a maximum of 15 s, followed by the attention-getter. If an infant attended for the maximum time on at least four of the six infant-controlled trials, the infant’s data were excluded from the analyses (3 infants).
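The infant-controlled trial rule above (end after a look-away longer than 1 s, otherwise cap at 15 s) can be stated compactly in code. The sketch below is a minimal illustration assuming gaze is coded as fixed-rate samples; the sampling encoding and function name are illustrative, not the actual Habit X implementation.

```python
def trial_duration(gaze_samples, sample_s=0.1, lookaway_limit_s=1.0, max_s=15.0):
    """Return when an infant-controlled trial ends under the rules in the text:
    the trial stops once the infant looks away for more than lookaway_limit_s,
    or after max_s if no look-away ever exceeds that limit.
    gaze_samples is a sequence of booleans (True = looking at the monitor),
    one per sample_s seconds -- an illustrative encoding."""
    away = 0.0
    for i, looking in enumerate(gaze_samples):
        t = (i + 1) * sample_s
        if t > max_s:
            return max_s
        away = 0.0 if looking else away + sample_s
        if away > lookaway_limit_s:
            return t
    return min(len(gaze_samples) * sample_s, max_s)

# An infant who looks for 3 s and then looks away for 1.2 s ends the trial early:
samples = [True] * 30 + [False] * 12 + [True] * 120
print(round(trial_duration(samples), 1))  # 4.1
```

Note that the trial ends at the moment the look-away exceeds the limit (here, 3 s of looking plus a 1.1-s look-away), not at the start of the look-away.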
First, we examined the data to determine whether there were any effects due to the order of presentation of test stimuli or the two different rhythm sets used. An independent-samples t-test on the two trial orders (familiar first or novel first), using the difference score for the looking times (familiar minus novel), revealed no significant difference between the orders, t(30) = 0.50, p = .62. Likewise, an independent-samples t-test comparing the two rhythm sets revealed no significant difference between the rhythm sets, t(30) = 0.68, p = .50. The data were thus collapsed across orders and rhythm sets in the subsequent analyses.
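The reported two-tailed p values can be checked against the t statistics and their 30 degrees of freedom. With df = 30 the t distribution is close to normal, so a normal approximation (an approximation, not the exact t distribution) reproduces the reported values to two decimals using only the standard library:

```python
import math

def two_tailed_p_normal_approx(t_stat: float) -> float:
    """Two-tailed p value under a normal approximation to the t distribution
    (adequate here because df = 30 is reasonably large)."""
    phi = 0.5 * (1 + math.erf(abs(t_stat) / math.sqrt(2)))  # standard normal CDF
    return 2 * (1 - phi)

print(round(two_tailed_p_normal_approx(0.50), 2))  # 0.62 (reported p = .62)
print(round(two_tailed_p_normal_approx(0.68), 2))  # 0.50 (reported p = .50)
```

Both reported p values are consistent with their t statistics, as expected.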
Our results demonstrate that 7-month-old infants can discriminate relative changes in visual rhythms when the visual rhythms are presented with an apparent motion cue. In the absence of apparent motion, the infants did not demonstrate an ability to discriminate changes in the test rhythms; they did not look differentially to either trial type. These results suggest that apparent motion cues facilitate infants’ encoding of visual rhythm, just as apparent motion cues and constant motion cues aided adults’ tapping synchrony to visual rhythms in Hove et al. (2010) and Iversen et al. (2009), respectively.
Interestingly, the difference in infants’ performance across conditions was observed despite the presence of multiple design features intended to direct infants’ attention to the rhythmic features of the stimuli. The familiarization materials were constructed such that rhythm was a constant property amid changes to multiple other dimensions of the stimuli (shape and color). Despite familiarization materials that maximally highlighted rhythm as a salient property of these stimuli, only the infants in the motion condition detected the change in rhythm during the test. Furthermore, the only change in the test trials was in the order of the IOIs within the rhythms; the overall length of the rhythmic patterns, the color and shape, and the IOIs used were the same across test items. It is thus particularly noteworthy that the infants in the stationary condition failed to discriminate the visual rhythms.
What is it about the apparent motion cues that facilitated performance in this task? One possibility is that the presence of apparent motion may cause the event onsets to be more attention grabbing for the infants, helping them to encode the temporal onsets more accurately and, thus, improving rhythmic discrimination. The onsets in the two conditions were programmed to be identical in an attempt to equate their salience across conditions. However, the addition of a new location perfectly correlated with the event onsets could be particularly attention grabbing. An alternative explanation is that the infants did not perceive the stationary condition in the same manner as the apparent motion condition, failing to detect the rhythm at all. Further research is needed to understand how stimulus onsets contribute to visual rhythm perception in the absence of motion cues.
The limited research on the role of motion in visual rhythm processing may seem surprising, especially given that all biological motion has rhythm. However, when the theoretical context of adult research on visual rhythm perception and production is examined, this lack of attention to motion makes more sense. In earlier work, researchers hypothesized that the auditory modality was superior to the visual modality for rhythm processing and suggested that this difference was rooted in the different processing skills across the two modalities, with the auditory modality favoring temporal processing and the visual modality favoring spatial processing (e.g., Glenberg & Swanson, 1986; Welch, 1999; Welch & Warren, 1980). Thus, a major focus of rhythm research has been the superiority of the auditory system over the visual system for processing temporal information (e.g., Collier & Logan, 2000; Glenberg & Jona, 1991; Glenberg, Mann, Altman, Forman, & Procise, 1989; Guttman, Gilroy, & Blake, 2005; Kolers & Brewster, 1985; Patel et al., 2005; Repp & Penel, 2002, 2004). As a result of this research goal, rhythmic stimuli are often inadvertently designed to favor the auditory system. Stimuli are initially created in the auditory modality and stand on their own as auditory rhythms (rhythmic tones), whereas the visual rhythms are created to be as comparable as possible to the auditory stimuli (stationary flashing lights). Because the auditory rhythmic stimuli lack motion cues, the visual rhythmic stimuli do as well. This design choice may have incidentally led to poorer performance on visual rhythm tasks because of the use of unnatural visual rhythms. The present study, as well as the two adult tapping synchrony studies (Hove et al., 2010; Iversen et al., 2009), demonstrates that inclusion of a motion cue, which is typical of real-world visual rhythms, markedly improves rhythm perception and production for infants and adults, respectively.
Considering how ubiquitous visual rhythms are in the environment, it is possible that infants use the structure found in visual rhythms to aid their decoding of their visual environment. The present research takes an important first step in demonstrating that apparent motion facilitates infants’ visual rhythm perception. Future research will reveal the extent to which infants can use visual rhythm to organize events and make predictions about their world.
This work was supported by NIH Grant RO1-HD37466 and a James S. McDonnell Foundation Scholar Award to JRS and by NIH Grant P30 HD03352 to the Waisman Center, University of Wisconsin–Madison. We thank the members of the Infant Learning Lab for their assistance in conducting this research; Erin Jonaitis, Bruno Repp, and two anonymous reviewers for helpful comments on a previous version of the manuscript; and the infants and their families for their participation.
Supplementary material: demonstration movie (MOV 11566 kb)
- Cohen, L. B., Atkinson, D. J., & Chaput, H. H. (2004). Habit X: A new program for obtaining and organizing data in infant perception and cognition studies (Version 1.0) [Computer software]. Austin: University of Texas.
- Gibson, E. J. (1969). Principles of perceptual learning and development. New York: Appleton-Century-Crofts.
- Iversen, J., Patel, A., Nicodemus, B., & Emmorey, K. (2009). Comparing synchronization to auditory and visual rhythms in hearing and deaf individuals. Paper presented at the Society for Music Perception and Cognition 2009 Biennial Conference, Indianapolis, IN.
- Welch, R. B. (1999). Meaning, attention, and the “unity assumption” in the intersensory bias of spatial and temporal perceptions. In G. Aschersleben, T. Bachmann, & J. Musseler (Eds.), Cognitive contributions to the perception of spatial and temporal events (pp. 371–387). Amsterdam: Elsevier.