Our perceptions of the world are often determined by the way different sensory modalities interact. In the case of hearing, for example, the ability of human listeners to identify sounds such as speech—particularly against a noisy background—or to judge their locations can be profoundly influenced by the availability of concurrent visual cues (reviewed by Alais et al. 2010). It is therefore not surprising that loss of vision can result in changes in auditory perceptual abilities and in the way sounds are processed within the brain. The nature and extent of these changes depend, however, on numerous factors, including the age at onset and the severity and duration of blindness, the aspect of auditory perception that is measured and, almost certainly, the degree to which visually-deprived subjects have come to depend on their hearing in their everyday lives (King 2014; Lazzouni and Lepore 2014). A range of auditory and other sensory functions can be altered as a result of blindness, but because of the particular importance of vision and hearing for spatial perception and navigation, this article will focus primarily on the impact of blindness on sound localization abilities and the underlying neural substrates.

Visual influences on sound localization in adulthood

Objects in the external world are often both seen and heard, thereby providing multiple cues about their spatial location. Combining the information available to each sensory system can result in a better estimate of object location than would be possible using either system in isolation (Alais et al. 2010). Indeed, sound localization accuracy declines in darkness (Lewald et al. 2000) or if subjects are blindfolded (Tabry et al. 2013). The impact of vision is further illustrated by the demonstration that misaligned visual cues can bias or capture the perceived location of a sound source, which forms the basis for the so-called ventriloquist illusion (Alais et al. 2010).

This visual dominance arises because the retina provides the brain with relatively high-resolution and reliable spatial information about the visual world, whereas sound localization is based on the detection and interpretation of spatial cues that vary in their usefulness with the amplitude and frequency composition of the sound and the region of space in which it needs to be localized (Schnupp et al. 2011). Importantly, however, blurring visual stimuli so that they become harder to localize causes the ventriloquist illusion to work in reverse, with spatially disparate sounds now biasing visual judgments (Alais and Burr 2004). Consequently, rather than vision having an inherent spatial advantage over hearing, it is more appropriate to regard the process by which they are combined in the brain as an example of optimal cue integration, with the weight given to each cue being proportional to its relative reliability.
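This idea can be made concrete with the standard maximum-likelihood cue-combination scheme (a textbook formulation offered here for illustration, not an equation taken from the studies cited above). If the visual and auditory estimates of source location, $\hat{S}_V$ and $\hat{S}_A$, are corrupted by independent noise with variances $\sigma_V^2$ and $\sigma_A^2$, the statistically optimal combined estimate weights each cue by its reliability (inverse variance):

$$\hat{S}_{AV} = w_V \hat{S}_V + w_A \hat{S}_A, \qquad w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_A^{2}}, \qquad w_A = 1 - w_V,$$

and has variance

$$\sigma_{AV}^{2} = \frac{\sigma_V^{2}\,\sigma_A^{2}}{\sigma_V^{2} + \sigma_A^{2}} \le \min\!\left(\sigma_V^{2}, \sigma_A^{2}\right).$$

On this account, a sharp visual stimulus (small $\sigma_V$) receives most of the weight and "captures" the sound, whereas blurring the visual stimulus increases $\sigma_V$ and shifts the weighting toward audition, which is the reversal of the ventriloquist illusion reported by Alais and Burr (2004).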

Studies of young children suggest that the developing brain may not integrate multisensory signals in the same statistically optimal fashion (Gori et al. 2008, 2012). Instead, it appears that children show strong unisensory dominance, with the dominant sensory modality depending on the task in question, which for visual–auditory space perception is vision (Gori et al. 2012). These authors proposed that the dominant modality calibrates the others during development, with vision therefore contributing to the emergence of auditory spatial abilities. This is consistent with neurophysiological evidence in animals that vision plays the key role in aligning the maps of space for different sensory modalities in the superior colliculus (King 2009). In this case, the developing auditory map is refined by experience so that it becomes aligned with the representation of visual space.

In view of these findings, it might be expected that early blindness would disrupt the normal process of crossmodal calibration, potentially altering the development of sound localization abilities. Indeed, both haptic (Gori et al. 2010) and auditory (Gori et al. 2014) processing deficits have been described in visually impaired individuals. On the other hand, studies in humans (Lazzouni and Lepore 2014) and other species (Rauschecker 1995) have shown that loss of one sensory modality can trigger a functional reorganization, particularly at the level of the cerebral cortex, expanding the neural territory available for processing information provided by the intact senses. Moreover, since blind individuals have to rely more on their hearing for their spatial awareness, it is likely that use-dependent plasticity will sharpen their auditory spatial skills, in much the same way that blind individuals who are adept at reading Braille show heightened tactile spatial acuity that is specific to their reading finger (Wong et al. 2011). These changes therefore have the potential to compensate for the lack of vision.

Sound localization abilities in the blind

A number of studies have reported that the ability of early blind subjects to localize sound differs from that of sighted controls. To some extent, methodological differences between these studies limit the conclusions that can be drawn, but there is now clear evidence that visually impaired humans (Ashmead et al. 1998; Lessard et al. 1998; Röder et al. 1999) and other species (Rauschecker 1995; King and Parsons 1999) can localize sounds in the horizontal plane as well as, and often better than, sighted individuals. These enhanced auditory spatial abilities are more pronounced for peripheral than for central regions of space (Rauschecker 1995; King and Parsons 1999; Röder et al. 1999; Voss et al. 2004).

Sound localization in the horizontal plane depends principally on binaural spatial cues, i.e., differences between the two ears in sound level and arrival time, which vary systematically in value with the angular direction of the sound source relative to the head. Although experience-dependent plasticity in the processing of binaural cues can occur (Keating and King 2013; Keating et al. 2015), there is evidence that heightened sensitivity to spectral localization cues—which result from the direction-dependent filtering properties of the head and external ears—underpins the superior localization abilities shown by some blind individuals (Doucet et al. 2005; Voss et al. 2011). This is consistent with the greater dependence on monaural spectral cues for localization in the horizontal plane seen in individuals whose binaural cues are compromised by hearing loss in one ear (Kumpik et al. 2010; Keating et al. 2013; Agterberg et al. 2014).
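To illustrate how one binaural cue varies with source direction, a widely used spherical-head (Woodworth) approximation (an idealization included here for illustration, not a formula taken from the studies cited above) gives the interaural time difference for a distant source at azimuth $\theta$ (in radians) as

$$\mathrm{ITD}(\theta) \approx \frac{a}{c}\left(\theta + \sin\theta\right),$$

where $a$ is the head radius and $c$ is the speed of sound; with $a \approx 9$ cm and $c \approx 343$ m/s, the ITD grows from zero at the midline to roughly 650–700 μs for a source at 90°, while interaural level differences vary in a strongly frequency-dependent manner owing to acoustic shadowing by the head.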

Not all studies, however, have reported improved spatial hearing in blind subjects. In contrast to the superior azimuthal performance that is often observed, localization in the midsagittal plane tends to be worse than in sighted controls (Zwiers et al. 2001; Lewald 2002). Because vertical localization is based primarily on spectral cues (Carlile et al. 2005; Tollin et al. 2013), this result seems difficult to reconcile with an improvement in sensitivity to these cues providing the basis for more accurate localization in the horizontal plane. It seems likely that different spectral features are selectively weighted under these conditions, but unravelling how this happens will require a better understanding of how spectral shape information is processed in the brain and integrated with binaural inputs. In addition to their impaired elevation judgments, blind subjects appear to struggle with more complex spatial tasks, as illustrated by their elevated thresholds, relative to normal-sighted controls, on a task in which they had to estimate the relative location of the second sound source in a sequence of three sounds presented from different directions in the horizontal plane (Gori et al. 2014). Together, these findings show that only some aspects of spatial hearing improve following blindness, which highlights the importance of the choice of behavioral task when investigating the crossmodal plasticity that results from loss of vision.

Neural substrates for altered hearing in the blind

As previously mentioned, visual cues are used to calibrate the developing auditory space map in the superior colliculus (King 2009). Removal of these guiding signals in infancy has been found to impair the maturation of auditory spatial tuning in this midbrain nucleus, with the extent of the changes observed varying with the species and the method of visual deprivation used (King and Carlile 1993; Withington-Wray et al. 1990; Wallace et al. 2004). For example, binocular eyelid suture in young ferrets results in normal auditory spatial selectivity in the superior colliculus but an abnormally high proportion of neurons that are ambiguously tuned to two different sound directions (King and Carlile 1993). Lid-sutured ferrets show no impairment, however, in the accuracy with which they approach sound sources in the horizontal plane (Kacelnik et al. 2006), while their spatial acuity in the lateral sound field is as good as, and sometimes superior to, that of normal-sighted control animals (King and Parsons 1999) (Fig. 1). Complete elimination of visual cues by rearing either guinea pigs (Withington-Wray et al. 1990) or cats (Wallace et al. 2004) in the dark has a more disruptive effect on auditory space map development. Although the sound localization abilities of these animals were not tested, it is likely that the crossmodal plasticity observed in the superior colliculus is more related to the maturation of a capacity for integrating spatial information across the senses than to the acquisition of spatial hearing abilities per se.

Fig. 1

Effects on auditory spatial acuity of depriving ferrets of patterned visual cues during development. a Testing arena used to measure the spatial acuity of adult animals at the midline. The ferrets were also tested in the lateral sound field with the loudspeakers placed symmetrically around 45° to one side. A trial was initiated when a ferret stood on the central start platform, placed its head through the head grid and made contact with the center spout. This triggered the presentation of a noise burst, which was selected at random from one of the two loudspeakers. In response, the ferret had to approach and lick the response spout positioned closest to the loudspeaker. b, c Logistic curves fitted to the psychometric functions in the lateral sound field for 4 visually-deprived ferrets. The shaded region represents the range of data from a normal-sighted control group. The stimuli were either 100 ms (b) or 40 ms (c) in duration. The performance of the visually-deprived animals was less variable than that of the sighted ferrets and their psychometric functions fell either just above or in the upper range for the control group. No difference was found, however, at the midline. Adapted, with permission, from King and Parsons (1999)
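As an aside, the kind of psychometric analysis described in this caption can be summarized with a minimal sketch of a logistic fit to two-choice localization data; the data values, the parameterization and the 75%-correct threshold criterion below are hypothetical illustrations and are not taken from King and Parsons (1999).

```python
# A minimal sketch (not the authors' analysis code) of fitting a logistic
# psychometric function to left/right localization data and deriving a
# spatial acuity threshold; the data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(separation, midpoint, slope):
    """Probability of a 'right' response as a function of signed speaker separation (deg)."""
    return 1.0 / (1.0 + np.exp(-(separation - midpoint) / slope))

# Hypothetical data: signed speaker separations (negative = left of the
# reference direction) and the proportion of 'right' responses at each.
separations = np.array([-30.0, -20.0, -10.0, -5.0, 5.0, 10.0, 20.0, 30.0])
p_right = np.array([0.05, 0.10, 0.25, 0.40, 0.60, 0.80, 0.92, 0.97])

(midpoint, slope), _ = curve_fit(logistic, separations, p_right, p0=[0.0, 5.0])

# Threshold: separation from the fitted midpoint at which the curve reaches
# a 0.75 probability of a 'right' response (a common 75% criterion).
threshold = slope * np.log(0.75 / 0.25)
print(f"bias = {midpoint:.1f} deg, acuity threshold = {threshold:.1f} deg")
```

The fitted midpoint captures any left–right response bias, while the angular separation needed to reach the criterion level of performance provides a single-number summary of spatial acuity of the kind compared across groups in such experiments.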

Surprisingly little work has been done on the effects of visual deprivation on the spatial selectivity of auditory cortical neurons. Korte and Rauschecker (1993) reported that neurons recorded in the anterior auditory cortex of lid-sutured cats show sharper spatial tuning than those in control animals. Although the anterior auditory field is not thought to contribute to the sound localization abilities of normal cats, the region sampled in this study included the anterior ectosylvian sulcus, which is necessary for determining sound source location (Lomber et al. 2007). Other studies have reported an expansion in the size of the tonotopically organized core auditory cortex in blind human subjects (Elbert et al. 2002), while placing young mice in the dark for a few days results in primary auditory cortical neurons with sharper frequency tuning and an improved capacity to discriminate changes in sound frequency and level (Petrus et al. 2014). It is possible that such changes contribute to the enhanced pitch discrimination reported in early blind human listeners (Gougoux et al. 2004).

Improvements in hearing abilities following blindness may also result from an increase in cortical territory devoted to auditory processing (Rauschecker 1995; Lazzouni and Lepore 2014). In particular, a number of studies have described structural and functional changes within the occipital cortex, with regions that would normally be involved in visual functions now responding to sound. Several lines of evidence suggest that this crossmodal reorganization is behaviorally relevant. First, the extent to which visual cortex is activated in blind subjects correlates with their sound localization accuracy (Gougoux et al. 2005; Voss et al. 2011). Second, functionally appropriate visual cortical areas are recruited following blindness, a principle that underpins the use of sensory substitution devices that convert visual information into auditory signals (Striem-Amit et al. 2012). For example, regions of the occipital cortex activated during auditory spatial processing in early blind individuals are also involved in visual spatial processing tasks in normal-sighted subjects (Renier et al. 2010; Collignon et al. 2011). Third, activation of visual cortical areas can be directly related to perception. Thus, in blind but not in sighted subjects, the perceived direction of auditory motion can be accurately classified from activity recorded in the middle temporal complex but not from auditory cortical areas (Jiang et al. 2014). Furthermore, application of repetitive transcranial magnetic stimulation (rTMS) to the right dorsal extrastriate cortex impairs auditory spatial but not pitch or level discrimination in blind subjects, whereas it has no effect on the ability of sighted subjects to perform any of these tasks (Collignon et al. 2007) (Fig. 2).

Fig. 2

a Functional reorganization of the visual cortex in blind subjects contributes to their improved performance in auditory and tactile tasks. In particular, occipital cortex activity correlates with superior performance of blind subjects when they localize sound using spectral shape cues. b Transient disruption of the right dorsal extrastriate cortex via rTMS (gray bars) produces a significant increase in auditory localization error rate in blind but not in sighted subjects. Adapted, with permission, from Collignon et al. (2007)

Although the age at which vision is lost has a bearing on the subsequent crossmodal plasticity in the brain and the way perceptual abilities change (Lazzouni and Lepore 2014), superior sound localization has been observed following both early and late visual deprivation (King and Parsons 1999; Voss et al. 2004; Fieger et al. 2006). Even more striking is the finding that blindfolding normal-sighted adult humans for short periods can result in a transient increase in sound localization accuracy (Lewald 2007). Since a 5-day period of visual deprivation can lead to behaviorally relevant tactile responses in the occipital cortex (Merabet et al. 2008), it is likely that these short-term changes reflect the unmasking of existing connections between brain regions that process inputs from different sensory modalities. In addition, more prolonged deprivation originating early in development probably results in abnormal patterns of connectivity. Indeed, it has been reported that early blind subjects show increased stimulus-dependent coupling between activity in auditory and visual cortical areas (Schepers et al. 2012), as well as greater auditory-to-visual cortex connectivity compared with sighted controls (Klinge et al. 2010). Other recent evidence indicates, however, that functional connectivity in early blind subjects is actually reduced between visual and auditory cortices but increased between visual cortex and frontal and parietal areas involved in cognitive processing, suggesting that occipital cortex activation during auditory tasks may reflect stronger top–down attentional control in blind subjects (Burton et al. 2014).

Concluding remarks

A large number of studies have now examined the impact of loss of vision on auditory (and tactile) processing. Because of the greater reliance that visually-deprived humans and animals have to place on their intact sensory modalities, it is not surprising that their behavioral performance is often superior to that of sighted controls. Nevertheless, even for the same function, such as sound localization, conflicting results have been obtained. Although some of this variation can be attributed to methodological differences between studies and to differences in the age at onset, duration and severity of blindness, it is clear that certain aspects of spatial hearing can be enhanced, whereas others are impaired in visually-deprived individuals. This likely reflects a trade-off between the lifelong influence of vision in calibrating neural representations of auditory space and the compensatory crossmodal plasticity that results from a combination of reduced inputs in one sensory modality and increased use of the others. In addition, the opposing effects of blindness on localization based on auditory spectral cues in the horizontal and vertical planes suggest that there may be a limited capacity for experience-dependent improvements in neural processing. Further investigation, particularly at a neuronal level, of the location and nature of the changes that take place following blindness will be key to revealing both the mechanisms underlying crossmodal plasticity and the functional and modality specificity of different brain regions.