A late, right-lateralized category effect complements an early, left-lateralized category effect in visual search
According to the Sapir–Whorf hypothesis, learned semantic categories can influence early perceptual processes. A central finding in support of this view is the lateralized category effect—namely, the finding that categorically different colors (e.g., blue and green hues) can be discriminated faster than colors within the same color category (e.g., different hues of green), especially when they are presented in the right visual field. Because the right visual field projects to the left hemisphere, this finding has been popularly couched in terms of the left-lateralization of language. However, other studies have reported bilateral category effects, which has led some researchers to question the linguistic origins of the effect. Here we examined the time course of lateralized and bilateral category effects in the classical visual search paradigm by means of eyetracking and RT distribution analyses. Our results show a bilateral category effect in the manual responses, which combines an early, left-lateralized category effect and a later, right-lateralized category effect. The newly discovered late, right-lateralized category effect occurred only when observers had difficulty locating the target, indicating a specialization of the right hemisphere for finding categorically different targets after an initial error. The finding that early and late stages of visual search show different lateralized category effects can explain a wide range of previously discrepant findings.
Keywords: Sapir–Whorf hypothesis · Whorfian effect · Visual search · Category effect · Categorical perception · Eye movements
Categorically distinct colors are easier to find than categorically identical colors (e.g., Bornstein & Korda, 1984; Harnad, 1987, 2003). That is, an odd-man-out color target in visual search will be found faster when it is categorically different from the distractors (e.g., blue amongst green) than when the target and distractor share a color category (e.g., different hues of green; Daoutis, Pilling, & Davies, 2006). This occurs even when distance in color space is controlled, which theoretically should render the targets and distractors equally discriminable on a perceptual level (e.g., Gilbert, Regier, Kay, & Ivry, 2006; Wolfe, 1998). Several studies have reported that such color category effects are lateralized, in that they are only present (Gilbert et al., 2006) or are more pronounced (Drivonikou et al., 2007) when the target is presented in the right visual field (RVF) rather than the left visual field (LVF). Since the RVF projects predominantly to the left hemisphere (due to contralateral visual pathways), the finding of (hemispherically) left-lateralized color category effects has been linked to the lateralization of language, which is also left-lateralized in the majority of the population (Kay & Kempton, 1984; Frost et al., 1999). For this reason, the left-lateralized color category effect has often been interpreted as support for the Sapir–Whorf hypothesis, that categories prescribed by language can influence perceptual processes (e.g., Drivonikou et al., 2007; Gilbert, Regier, Kay, & Ivry, 2006, 2008).
Still, a number of recent studies have questioned the linguistic origins of this effect. First, similar left-lateralized category effects have been obtained using unlabeled categories (Holmes & Wolff, 2012; see also Holmes & Regier, 2017). Second, several studies have failed to find left-lateralized category effects, but found only bilateral effects (e.g., Witzel & Gegenfurtner, 2013; see also Brown, Lindsey, & Guckes, 2011; Fonteneau & Davidoff, 2007; Holmes, Franklin, Clifford, & Davies, 2009), giving rise to nonlinguistic theories of these effects.
The aim of the present study was to obtain fine-grained measurements of the time course of lateralized and bilateral category effects in visual search, in order to assess whether the discrepant results can potentially be explained with reference to differences in the time courses of these effects (regardless of the possible linguistic or nonlinguistic origins of the category effect).
What factors could explain the discrepant findings of bilateral versus lateralized category effects?
At a very basic level, a bilateral category effect could be due to a perceptual confound—for example, when the across-category targets are perceptually more dissimilar, and therefore pop out more strongly, than the within-category targets (e.g., green targets of different hues; Duncan & Humphreys, 1989; Witzel & Gegenfurtner, 2013). Such a perceptual confound would produce an “artificial” bilateral category advantage (but not a lateralized category effect). Alternatively, the presence of a bilateral category effect (and/or the absence of lateralized category effects) could be due to eye movements: When observers are allowed to freely move their eyes, objects presented on the right side of the display will not necessarily project to the left hemisphere, and vice versa, which could produce bilateral category effects (even if the category effect is left-lateralized). These explanations, however, are decidedly unsatisfying, given that bilateral category effects have been reported even in studies that have carefully controlled for perceptual confounds and/or eye movement artifacts (see, e.g., Witzel & Gegenfurtner, 2013).
A potentially more compelling explanation of the frequent failure to obtain lateralized category effects is that left-lateralized category effects emerge only transiently, at an early stage of visual search (Roberson, Pak, & Hanley, 2008; see also Regier & Kay, 2009). According to this explanation, the commonly used measurements of manual response times (RTs) may have failed to show the lateralized category effect because mean RTs probe visual search at a very late stage (i.e., after visual selection, target identification, and response selection and execution), and are therefore not very sensitive to effects occurring only at an early stage of visual search (Becker, 2010a, b).
In line with this explanation, several studies have shown that the lateralized category effect indeed emerges at an early stage of visual search. For instance, when observers are asked to make a fast eye movement to a color target, a left-lateralized category effect is found in the saccadic latencies of the first eye movements (i.e., the time needed to visually select the target), indicating that early processes in visual search can already show lateralized category effects (Al-Rasheed, Franklin, Drivonikou, & Davies, 2014; but see Brown et al., 2011). Similarly, in an electroencephalographic (EEG) study, Liu et al. (2009) found a left-lateralized category effect in the N2pc, an event-related potential that indexes the time required to covertly attend to the target (Luck & Hillyard, 1994). Interestingly, in the study by Liu et al., the mean RTs simultaneously failed to show a lateralized category effect and revealed instead a bilateral category effect. This is a particularly intriguing result, because it suggests that an early, lateralized category advantage can transform into a bilateral category effect at later stages of visual search. Yet the factors mediating the occurrence of bilateral versus lateralized effects remain unclear.
Category effects were examined in two blocked conditions: In the saccade task, observers were instructed to make a fast and precise eye movement to the target, and time-course information was obtained by analyzing eye movement parameters at different stages of visual search. Following standard procedures, we tapped into early processes of visual search by analyzing the accuracy and latency of the first eye movements, and examined later processes (e.g., including distractor rejection) by probing the time required to select the target (for similar approaches, see Becker, 2010a, b; Becker, Harris, Venini, & Retell, 2014; Zhao et al., 2012). To examine whether the results would generalize to conditions in which eye movements were not allowed, observers also completed a fixation task, in which they had to remain fixated on the central fixation cross during the entire trial. In the fixation task, coarse-grained time-course information was obtained by analyzing category effects separately for different portions of the RT distribution (fast vs. slow RTs).
Sixteen right-handed volunteers (nine female, seven male; mean age = 24.31 years, SD = 2.85 years) from the University of Queensland, Australia, participated in exchange for AUD 10. All participants gave informed consent and had normal or corrected-to-normal vision.
The apparatus consisted of a PC with a 2.4-GHz Intel Core 2 Duo CPU running the Presentation software (Neurobehavioral Systems), a 21-in. color LCD monitor (BenQ FP92V; resolution: 1,280 × 1,024; refresh rate: 75 Hz), a video-based infrared eyetracker (EyeLink 1000, SR Research Ltd., Ontario, Canada), and a standard USB optical mouse.
The search display consisted of 12 colored squares (1.8° × 1.8°) that were presented equidistantly (5.5°) from a central black fixation cross (0.3° × 0.3°) against a light gray background (see Fig. 1). The colors of the search stimuli were blue (CIE x,y = .265, .273), blue–green (CIE x,y = .243, .312), green–blue (CIE x,y = .252, .338), and green (CIE x,y = .266, .372) and were matched for luminance using a CRS ColorCal colorimeter (63.9–64.6 cd/m2; mean: 64.2 cd/m2).
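The chromaticity coordinates above are reported in CIE xyY form (chromaticity x, y plus luminance Y). As an aside for readers who want to work with these values, the standard colorimetric conversion to XYZ tristimulus values can be sketched as follows; this is textbook colorimetry, not part of the authors' methods, and the function name is ours.

```python
# Convert CIE xyY (chromaticity + luminance) to XYZ tristimulus values.
# Standard colorimetry: X = xY/y, Z = (1 - x - y)Y/y. The chromaticities
# below are the stimulus values reported in the Method section, paired
# with the mean luminance of 64.2 cd/m^2.

def xyY_to_XYZ(x, y, Y):
    """Standard CIE xyY -> XYZ conversion."""
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return X, Y, Z

stimuli = {
    "blue":       (0.265, 0.273),
    "blue-green": (0.243, 0.312),
    "green-blue": (0.252, 0.338),
    "green":      (0.266, 0.372),
}

for name, (x, y) in stimuli.items():
    X, Y, Z = xyY_to_XYZ(x, y, 64.2)
    print(f"{name}: X={X:.1f} Y={Y:.1f} Z={Z:.1f}")
```

Because all four stimuli share (approximately) the same luminance Y, differences between them are carried by chromaticity alone, which is the point of the luminance matching described above.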
The visual search task consisted of two blocked conditions (saccade task, fixation task), presented in random order. Within each block, the target position (left/right hemifield) was chosen randomly. Target category was controlled: The colors for the target and distractor were always chosen in pairs of the most similar colors, resulting in three possible color pairs, of which blue/blue–green and green/green–blue were the within-category color pairs, and green–blue/blue–green was the across-category color pair. The target–distractor pairs were presented equally often, for a total of 300 trials per block.
Prior to the experiment, observers were asked to label the colors as green or blue as the colors were presented individually on the screen. Responses were recorded manually by the experimenter. To ensure stable and accurate eyetracking, observers were calibrated (9-point calibration) and a fixation control was implemented before each trial. This control was particularly important, to ensure that participants did not shift their head or eyes prior to the trial in such a way that would result in the stimuli being projected to incorrect visual fields. A fixation cross was presented, and after participants had continuously fixated within 1.3° of the central cross (for at least 500 ms, within a time window of 2,000 ms) the search display was presented. If appropriate fixation was not registered by the eyetracker, the participant was calibrated anew. The search display was presented until the participant made a manual response to indicate whether the target was on the left or the right, and the search display was followed immediately by a feedback display consisting of the words “Correct” or “Wrong” (in 12-pt Arial font presented for 750 ms, followed by an intertrial interval [ITI] of 250 ms), plus the words “No Fixation” when the participant had moved the eyes during the fixation task, or “No Eye Movement” when the participant had failed to make an eye movement in the saccade task (in which case the entire feedback display was presented for 1,250 ms, followed by an ITI of 750 ms).
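The pre-trial fixation control described above can be made concrete with a small sketch: the trial starts only once gaze has stayed within 1.3° of the central cross for at least 500 ms continuously, inside a 2,000-ms window. The sample format (time in ms, x/y in degrees relative to the cross) and the function name are our assumptions, not the authors' actual implementation.

```python
# Sketch of the pre-trial fixation check: gaze must remain within
# `radius` degrees of the fixation cross at (0, 0) for `hold_ms`
# continuous milliseconds inside the first `window_ms` of the check.
# Samples are (time_ms, x_deg, y_deg) tuples; format is an assumption.

def fixation_ok(samples, radius=1.3, hold_ms=500, window_ms=2000):
    """Return True once a continuous in-radius run of hold_ms is seen."""
    run_start = None
    for t, x, y in samples:
        if t > window_ms:
            break
        if x * x + y * y <= radius * radius:
            if run_start is None:
                run_start = t          # start of an in-radius run
            if t - run_start >= hold_ms:
                return True            # held fixation long enough
        else:
            run_start = None           # gaze left the window; reset
    return False

# 1-kHz samples with gaze steady near the cross: check passes.
steady = [(t, 0.1, -0.2) for t in range(0, 1000)]
fixation_ok(steady)
```

If the check fails, the procedure above recalibrates the participant rather than starting the trial, which is what guards the hemifield mapping of the stimuli.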
In the saccade task, eye movements were attributed to a stimulus (target or nontarget) when the saccade (eye movements with a velocity > 30°/s or acceleration > 8,000°/s2) ended within 1.5° of the center of the stimulus.
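The endpoint-attribution rule can be sketched as follows. The 1.5° criterion and the display geometry (12 stimuli at 5.5° eccentricity) come from the text; the stimulus positions, field names, and helper function are illustrative, not the authors' analysis code.

```python
import math

# Attribute a saccade to a stimulus when its endpoint lands within
# 1.5 deg of that stimulus's center (the criterion reported above).

def attribute_saccade(endpoint, stimulus_centers, radius=1.5):
    """Return the index of the first stimulus whose center lies within
    `radius` degrees of the saccade endpoint, or None if none does."""
    ex, ey = endpoint
    for i, (sx, sy) in enumerate(stimulus_centers):
        if math.hypot(ex - sx, ey - sy) <= radius:
            return i
    return None

# 12 stimuli equally spaced on a 5.5-deg circle, as in the display.
centers = [(5.5 * math.cos(2 * math.pi * k / 12),
            5.5 * math.sin(2 * math.pi * k / 12)) for k in range(12)]

attribute_saccade((5.4, 0.3), centers)   # lands on the stimulus at 0 deg
attribute_saccade((0.0, 0.0), centers)   # central fixation: no stimulus
```

Because the stimuli are 5.5° apart around the circle, a 1.5° radius cannot assign one endpoint to two stimuli, so the first-match loop is unambiguous.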
Prior to the data analysis, outlier responses (RTs of <200 ms or >2,000 ms) and trials in which participants broke fixation (fixation task) or failed to saccade to a stimulus (saccade task) were excluded. This led to a loss of 7.68% of the data in the fixation task (failure to fixate: 6.96%) and 17.81% of the data in the saccade task (failure to saccade to a stimulus: 16.86%).
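The exclusion step above amounts to a simple filter over trials; a minimal sketch follows. The RT cutoffs are the ones reported in the text, while the trial representation and field names are our assumptions.

```python
# Sketch of the trial-exclusion rule: drop RTs below 200 ms or above
# 2,000 ms, plus trials flagged by the eyetracker (broken fixation in
# the fixation task, no saccade to a stimulus in the saccade task).
# The dict-based trial format is illustrative, not the authors' format.

def exclude_outliers(trials):
    """Keep trials with 200 <= RT <= 2000 ms and no eyetracking flag."""
    return [t for t in trials
            if 200 <= t["rt"] <= 2000 and not t.get("flagged", False)]

trials = [
    {"rt": 150,  "flagged": False},   # anticipatory response: excluded
    {"rt": 540,  "flagged": False},   # kept
    {"rt": 2600, "flagged": False},   # too slow: excluded
    {"rt": 610,  "flagged": True},    # broke fixation: excluded
]
surviving = exclude_outliers(trials)
```

Note that in the reported data most of the loss came from the eyetracking criteria (6.96% and 16.86%), not from the RT cutoffs themselves.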
The results from the naming task showed that 100% of participants (n = 16) correctly categorized the blue colors, and 81% (n = 13) correctly categorized the green colors. The visual search results did not differ when participants with an incorrect categorization judgment were excluded (probably because this applied only to very few participants); hence, in the following report, all participants are included.
Mean errors for across- and within-category targets in the right and left hemifields (RVF, LVF)
First eye movements to the target
Analyzing the latencies of first eye movements to the target in the same manner revealed only a significant main effect of color category, F(1, 15) = 8.0, p = .013 (all other Fs < 3.4, ps > .08). To test whether the category effect was present in both the RVF and LVF, we computed two paired t tests. The results showed a left-lateralized category effect (mirroring the proportions of first eye movements): Although, numerically, the category advantages were similarly large for RVF targets (10 ms) and LVF targets (9 ms), the category effect was significant only for RVF targets, t(15) = 3.0, p = .008, not for LVF targets, t(15) = 1.5, p = .17. Thus, in line with earlier findings, early processes in visual search (as indexed by the accuracy and speed of the first eye movement in a trial) show a clear left-lateralized category effect.
Target fixation latencies
To probe category effects at a later stage of visual search, we next analyzed the time from the onset of the search display to the point in time at which the eyes first fixated the target. To ensure that the hemispheric mapping of the target was retained, only trials in which eye movements did not alter the hemispheric mapping of the target were included (i.e., the target was consistently in the LVF or RVF, despite eye movements; 73.2% of trials). The results showed only a significant main effect of target category, F(1, 15) = 27.88, p < .001, η p 2 = .65, with across-category targets being selected 25 ms earlier than within-category targets, but no significant effect of visual field or interaction, all Fs < 1.9, ps > .19. The category effects were significant for both RVF and LVF targets separately, t(15) = 3.41, p = .004, and t(15) = 3.59, p = .003, and there were no differences in search times for across-category targets in the LVF versus the RVF, t(15) = 1.14, p = .27. Thus, this later measure (of the time needed to visually select the target) showed a bilateral category effect.
The results of the first analyses showed an early left-hemispheric category effect in the accuracy of first saccades, which transformed into a bilateral category effect in later measures. It is interesting to note that observers were more accurate with first eye movements to RVF across-category targets than to LVF across-category targets, and yet they showed the same target fixation latencies for LVF and RVF across-category targets. This indicates that, after an initial disadvantage in localizing LVF across-category targets, these targets produced a stronger category effect that neutralized the initial advantage for RVF across-category targets.
In sum, the data show that the bilateral category effects observed in manual RTs consist of two components—an early, left-lateralized category effect when the target is selected immediately, and a right-lateralized category effect when observers initially miss the target and continue searching.
RT distribution analysis: Fixation task
Taken together, these results reflect an early left-hemispheric category effect in the fastest manual RTs of the fixation task, and a comparatively strong bilateral category effect in the slowest manual RTs. In this respect, the results mirror those of the saccade task, which also showed an early, left-lateralized category effect and a bilateral category effect at a later stage of visual search.
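The RT-distribution (fast vs. slow) analysis used here can be sketched in a few lines: split each condition's RTs at the median and compare the across- versus within-category means within each half. The toy data and helper names below are illustrative only; they are not the study's data or analysis code.

```python
import statistics

# Sketch of an RT-distribution analysis: split each condition's RTs
# at the median into fast and slow halves, then compute the category
# advantage (within-category mean minus across-category mean) per half.

def split_fast_slow(rts):
    """Return (fast, slow) halves of an RT list, split at the median."""
    ordered = sorted(rts)
    mid = len(ordered) // 2
    return ordered[:mid], ordered[mid:]

def category_effect(within_rts, across_rts):
    """Mean RT advantage (ms) of across- over within-category targets."""
    return statistics.mean(within_rts) - statistics.mean(across_rts)

# Toy RTs (ms) illustrating a category advantage that grows in the
# slow half of the distribution, as in the pattern reported above.
within = [520, 560, 610, 700, 780, 860]
across = [500, 545, 600, 640, 700, 760]

w_fast, w_slow = split_fast_slow(within)
a_fast, a_slow = split_fast_slow(across)
category_effect(w_fast, a_fast)   # small effect in the fast RTs
category_effect(w_slow, a_slow)   # larger effect in the slow RTs
```

In the real analysis the split is computed per participant and condition before averaging, so this sketch only conveys the logic of binning, not the full procedure.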
The present findings resolve previously discrepant results regarding color category effects in visual search. The typical dependent variables used to index early processes in visual search (i.e., first eye movement parameters) yielded a clear left-lateralized category effect, whereas the typical dependent variable predominantly used in previous studies (manual RTs) showed a bilateral category effect—in both the fixation task and the saccade task. Interestingly, our left-lateralized category effect (observed in the proportions of first eye movements to the target in the saccade task and the fastest RTs in the fixation task) translated into a bilateral category effect in later stages of visual search (i.e., target fixation latencies in the saccade task, slowest RTs in the fixation task).
Previous studies have already shown that the left-lateralized category effect occurs at an early stage of visual search (Al-Rasheed et al., 2014; Franklin, Drivonikou, Bevis, et al., 2008a; Franklin, Drivonikou, Clifford, et al., 2008b; Liu et al., 2009; Mo, Xu, Kay, & Tan, 2011; Thierry et al., 2009), but it was unclear whether this left-lateralized category effect is only present at an early stage of visual search, or whether it has a more sustained, longer-lasting influence on search performance. Our findings suggest that the left-lateralized category advantage is only a transient, short-lived effect that is limited to an early stage of visual search.
Second, previous studies (e.g., Liu et al., 2009) documented that an early, left-lateralized category effect does not necessarily propagate to later measures such as the mean RT, which can show equally large category effects in both hemifields. The present study has clarified that (1) the early, left-lateralized category effect can only be observed when the target is found immediately and with the first glance, in which case (2) the left-lateralized category effect does propagate to later measures (e.g., the mean RT). Critically, a bilateral category effect emerges in later measures (e.g., mean RT), because (1) a late, right-lateralized category effect can emerge on trials in which the target is initially missed and distractor rejection becomes a key component of visual search, and (2) this effect adds together with the early, left-lateralized category effect to produce a bilateral category effect in mean RTs (or other late measures, such as the time to select the target).
The finding that the bilateral category effect consists of an early, left-lateralized component and a later, right-lateralized component has important theoretical implications. First, the present findings allow precise predictions about when the results of a given study will show a left-lateralized versus a bilateral category effect. Because the right-lateralized category effect emerged only when the target was initially missed, it follows that the data should show an early, left-lateralized category effect when search is easy (e.g., when the color contrast is high or with highly practiced observers), whereas bilateral category effects should emerge when search is more difficult (e.g., with less practiced observers; e.g., Franklin, Drivonikou, Bevis, et al., 2008a; Franklin, Drivonikou, Clifford, et al., 2008b). In addition, the choice of the dependent variable alone could already tip the results toward left-lateralized versus bilateral category effects. Early dependent measures (e.g., N2pc or first eye movement parameters) should be more likely to show a left-lateralized category effect, whereas later measures, such as the mean RT, should be equally likely to show a left-lateralized or bilateral category effect (depending on the ease of the search task; for corresponding results, see Al-Rasheed et al., 2014; Franklin, Drivonikou, Bevis, et al., 2008a; Franklin, Drivonikou, Clifford, et al., 2008b; Liu et al., 2009; Mo et al., 2011; Thierry et al., 2009). With this, the present account can explain a range of previous and apparently discrepant results.
Another important implication of the present results is that the finding of a bilateral category effect cannot be taken to refute the Whorfian hypothesis that language can affect early perceptual processes (e.g., Witzel & Gegenfurtner, 2013). As we demonstrated here, the bilateral category effect includes the left-lateralized category effect, which has often been cited in support of the Sapir–Whorf hypothesis. Naturally, it is also possible to explain the left-lateralized category effect in terms of categorical perception, without the need for a linguistic mechanism (e.g., Holmes & Wolff, 2012). Still, according to our findings, the left-lateralized category effect is based on early attentional or perceptual processes that facilitate the selection of across-category targets in the right hemifield—for instance, because the initial attentional bias for across-category targets is stronger, or perceptual sensitivity is initially enhanced for such targets in the right hemifield. With this, the left-lateralized category effect is similar to other short-lived attentional processes that can bias visual selection to particular items. Such attentional biases can also alter our perception of these stimuli, as attention can reliably accelerate the perceived timing of events (Hikosaka, Miyauchi, & Shimojo, 1999; Priess, Scharlau, Becker, & Ansorge, 2012) and increase the perceived contrast and resolution of the attended stimulus (Carrasco, Ling, & Read, 2004; Yeshurun & Carrasco, 1998). Given these findings, it is plausible that across-category targets are perceived differently in the right versus the left hemifield, whereas it still remains to be shown whether the facilitated visual selection of across-category targets in the RVF is indeed due to the involvement of language (e.g., due to the proximity of corresponding brain areas) or to a different, language-independent hemispheric specialization.
The major finding of the present study was that a large, right-lateralized category effect also facilitated detection of across-category targets in the left visual field when the target had been initially missed. Earlier studies had reported right-lateralized category effects for fine-grained line discriminations (Franklin et al., 2008a, b), suggesting that the right hemisphere may be specialized for subtle category differences (see also Holmes & Wolff, 2012). Our results support this idea, since erroneous selection of a nontarget would neutralize all initial top-down biases and require finding the target in terms of its then reduced categorical difference. Alternatively or additionally, the right hemisphere may be specialized for fast, parallel rejection of distractors (e.g., Polich, 1982). Although the exact cause of the right-lateralized category effect still needs to be determined, its discovery resolves the longstanding debate about whether category effects are lateralized or bilateral.
Note: The green–blue boundary has been a popular boundary to investigate, and it has been argued that this boundary is most likely to result in perceptual confounds (see Witzel & Gegenfurtner, 2015, 2016, for details on the mechanisms specific to this boundary that are thought to result in spurious results).
This research was supported by an Australian Research Council (ARC) Future Fellowship (FT130101282), a Discovery Grant (DP170102559), and a University of Queensland Foundation Research Excellence Award granted to S.I.B.
- Becker, S. I., Harris, A. M., Venini, D., & Retell, J. D. (2014). Visual search for colour and shape: When is the gaze guided by feature relationships, when by feature values? Journal of Experimental Psychology: Human Perception and Performance, 40, 264–291. doi: 10.1037/a0033489
- Becker, S. I., Lewis, A. J., & Axtens, J. E. (2017). Top-down knowledge modulates onset capture in a feedforward manner. Psychonomic Bulletin & Review. doi: 10.3758/s13423-016-1134-2
- Brown, A., Lindsey, D., & Guckes, K. (2011). Color names, color categories, and color-cued visual search: Sometimes, color perception is not categorical. Journal of Vision, 11(12), 2. doi: 10.1167/11.12.2
- Drivonikou, G. V., Kay, P., Regier, T., Ivry, R. B., Gilbert, A. L., Franklin, A., & Davies, I. R. L. (2007). Further evidence that Whorfian effects are stronger in the right visual field than the left. Proceedings of the National Academy of Sciences, 104, 1097–1102. doi: 10.1073/pnas.0610132104
- Franklin, A., Drivonikou, G. V., Bevis, L., Davies, I. R. L., Kay, P., & Regier, T. (2008a). Categorical perception of color is lateralized to the right hemisphere in infants, but to the left hemisphere in adults. Proceedings of the National Academy of Sciences, 105, 3221–3225. doi: 10.1073/pnas.0712286105
- Franklin, A., Drivonikou, G. V., Clifford, A., Kay, P., Regier, T., & Davies, I. R. L. (2008b). Lateralization of categorical perception of color changes with color term acquisition. Proceedings of the National Academy of Sciences, 105, 18221–18225. doi: 10.1073/pnas.0809952105
- Harnad, S. (1987). Categorical perception: The groundwork of cognition. Cambridge, UK: Cambridge University Press.
- Harnad, S. (2003). Categorical perception. In Encyclopedia of cognitive science (Vol. 1). Nature Publishing.
- Holmes, K. J., & Regier, T. (2017). Categorical perception beyond the basic level: The case of warm and cool colors. Cognitive Science. doi: 10.1111/cogs.12393
- Liu, Q., Li, H., Campos, J. L., Wang, Q., Zhang, Y., Qiu, J., . . . Sun, H.-J. (2009). The N2pc component in ERP and the lateralization effect of language on color perception. Neuroscience Letters, 454, 58–61. doi: 10.1016/j.neulet.2009.02.045
- Thierry, G., Athanasopoulos, P., Wiggett, A., Dering, B., Kuipers, J.-R., & Ungerleider, L. (2009). Unconscious effects of language-specific terminology on preattentive color perception. Proceedings of the National Academy of Sciences, 106, 4567–4570. doi: 10.1073/pnas.0811155106
- Witzel, C., & Gegenfurtner, K. (2013). Categorical sensitivity to color differences. Journal of Vision, 13(7), 1. doi: 10.1167/13.7.1
- Witzel, C., & Gegenfurtner, K. R. (2015). Categorical facilitation with equally discriminable colors. Journal of Vision, 15(8), 22. doi: 10.1167/15.8.22
- Wolfe, J. M. (1998). Visual search. In H. Pashler (Ed.), Attention (pp. 13–73). London, UK: University College London Press.
- Zhao, G., Liu, Q., Jiao, J., Zhou, P., Li, H., & Sun, H.-J. (2012). Dual-state modulation of the contextual cueing effect: Evidence from eye movement recordings. Journal of Vision, 12(6), 11. doi: 10.1167/12.6.11