Experimental Brain Research, Volume 162, Issue 4, pp 509–512

Spatial and non-spatial auditory processing in the lateral intraparietal area

Authors

  • Gordon W. Gifford III
    • Department of Psychological and Brain Sciences, Dartmouth College
  • Yale E. Cohen
    • Department of Psychological and Brain Sciences, Dartmouth College
    • Center for Cognitive Neuroscience, Dartmouth College
Research Note

DOI: 10.1007/s00221-005-2220-2

Cite this article as:
Gifford, G.W. & Cohen, Y.E. Exp Brain Res (2005) 162: 509. doi:10.1007/s00221-005-2220-2

Abstract

We tested the responses of neurons in the lateral intraparietal area (area LIP) for their sensitivity to the spatial and non-spatial attributes of an auditory stimulus. We found that the firing rates of LIP neurons were modulated by both of these attributes. These data indicate that, while area LIP is involved in spatial processing, spatial and non-spatial processing are not restricted to independent channels.

Keywords

Auditory · Lateral intraparietal sulcus · Spatial · Non-spatial

Introduction

The lateral intraparietal (LIP) area plays an important role in spatial perception and action (Kusunoki et al. 2000; Cohen and Andersen 2004). Since area LIP is part of the dorsal pathway and is involved primarily in spatial processing, it had been thought that LIP neurons were not modulated by the non-spatial attributes of a sensory stimulus (Ungerleider and Mishkin 1982). However, recent studies have provided data contrary to this hypothesis. For instance, LIP neurons are selective for the shape of a visual stimulus (Sereno and Maunsell 1998). Other experiments indicate that LIP neurons are modulated by the color of a visual stimulus when it is relevant to the successful completion of a behavioral task (Toth and Assad 2002).

Since area LIP combines input from different sensory modalities to form a multimodal representation of extra-personal space (Cohen and Andersen 2004), we hypothesized that area LIP may also process the non-spatial attributes (that is, the spectrotemporal structure) of an auditory stimulus. We found that LIP neurons code significant amounts of information about both the spatial and non-spatial attributes of an auditory stimulus. These data indicate that, while area LIP is involved in spatial processing, spatial and non-spatial processing are not restricted to independent processing streams.

Methods

General methods

Rhesus monkeys (Macaca mulatta) were placed in front of a stimulus array. The array consisted of eight equally spaced speakers that, relative to the monkeys' position, formed a circle with a radius of 12° around a “central” speaker. Eye position was recorded with a scleral eye coil (1 kHz sampling rate) (Judge et al. 1980). Extracellular action potentials were recorded with tungsten electrodes (FHC Inc.) that were inserted into a recording chamber (Crist Instruments). The location of the lateral bank of the intraparietal sulcus was determined in each monkey by visualizing a recording electrode in the posterior parietal cortex with magnetic-resonance images (Gifford III and Cohen 2004; Cohen et al. 2004). LIP neurons were identified by their visual, perisaccadic, and saccadic responses. All surgical procedures and protocols were approved by Dartmouth College’s Institutional Animal Care and Use Committee and were in accordance with the “Principles of Animal Care” (NIH Publication No. 86-23, revised 1985). The data reported in this paper were collected as part of a previous study that investigated LIP auditory activity in monkeys that had not been trained to associate an auditory stimulus with an action (Gifford III and Cohen 2004).

Stimuli

Auditory stimuli

Two classes of auditory stimuli were used: species-specific vocalizations (SSVs) and band-pass noise. An important feature of these classes is that their spectrotemporal structures, as measured through Wiener entropy (Tchernichovski et al. 2000), are significantly different (Kolmogorov–Smirnov test). The SSVs were recorded and digitized as part of an earlier set of studies (Hauser 1998). “Fresh” exemplars of band-pass noise (pass-band = 0.55–15.25 kHz) were generated on each trial in a digital-signal-processing environment based on the AP2 DSP card (Tucker Davis Technologies). The durations of the noise bursts were drawn to match the distribution of SSV durations (mean = 326 ms; SD = 129 ms). Each auditory stimulus was presented at a sound level of 65 dB SPL. The stimuli were presented through a D/A converter (DA1, Tucker Davis Technologies), amplifiers (SA1, Tucker Davis Technologies; MPA-250, Radio Shack), and transduced by a speaker (Pyle, PLX32).
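For illustration, the sketch below shows how the two stimulus properties described above might be computed and generated in Python: Wiener entropy as the ratio of the geometric to the arithmetic mean of the power spectrum, and a “fresh” band-pass noise burst synthesized in the frequency domain. This is a minimal sketch, not the AP2 DSP-card implementation used in the study; the sampling rate, the synthesis method, and the clipping of implausibly short durations are assumptions.

```python
import numpy as np

def wiener_entropy(signal, n_fft=512):
    """Wiener entropy: ratio of the geometric to the arithmetic mean of
    the power spectrum (~1 for broadband noise, ->0 for tonal sounds)."""
    power = np.abs(np.fft.rfft(signal, n_fft)) ** 2
    power = power[power > 0]                       # avoid log(0)
    return np.exp(np.mean(np.log(power))) / np.mean(power)

def fresh_bandpass_noise(fs=48_828, low=550.0, high=15_250.0,
                         dur_mean=0.326, dur_sd=0.129, rng=None):
    """One 'fresh' band-pass noise burst (0.55-15.25 kHz) whose duration
    is drawn from the SSV duration distribution (mean 326 ms, SD 129 ms)."""
    rng = np.random.default_rng() if rng is None else rng
    dur = max(0.05, rng.normal(dur_mean, dur_sd))  # clip implausibly short draws
    n = int(dur * fs)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    spectrum = np.zeros(freqs.size, dtype=complex)
    band = (freqs >= low) & (freqs <= high)
    spectrum[band] = np.exp(1j * rng.uniform(0, 2 * np.pi, band.sum()))
    noise = np.fft.irfft(spectrum, n)              # flat magnitude, random phase
    return noise / np.max(np.abs(noise))           # unit peak; scale to 65 dB SPL downstream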

Visual stimuli

Visual stimuli were produced by a red LED that was mounted and centered on each speaker.

Behavioral tasks

In the “visual-saccade” task, the monkeys fixated the LED mounted on the central speaker (the “central LED”); 500–1,000 ms later, one of the eight peripheral LEDs was illuminated. After an additional 500–1,000 ms, the central LED was extinguished, signaling the monkeys to shift their gaze to the illuminated peripheral LED.

In the “gap-fixation” task, the monkeys fixated the central LED; 1,000–1,500 ms later, it was extinguished, but the monkeys were required to maintain their gaze at the location of the extinguished central LED. After a further 300–500 ms, an auditory stimulus was presented. The central LED was re-illuminated 700–800 ms after auditory-stimulus offset, and the monkeys continued to maintain their gaze at its location for an additional 500–1,000 ms. During this task, monkeys kept their gaze within 1.5° of this fixation point and did not systematically vary their eye position with auditory-stimulus location.
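Codified, the gap-fixation trial structure looks like the following sketch (times in seconds from fixation onset). The interval ranges come from the text; drawing the stimulus duration from the SSV distribution is an assumption.

```python
import numpy as np

def gap_fixation_timeline(rng=None):
    """Event times (s) for one gap-fixation trial, measured from fixation
    onset. Interval ranges follow the task description."""
    rng = np.random.default_rng() if rng is None else rng
    t = {}
    t['led_off'] = rng.uniform(1.000, 1.500)                  # initial fixation
    t['stim_on'] = t['led_off'] + rng.uniform(0.300, 0.500)   # gap
    t['stim_off'] = t['stim_on'] + max(0.05, rng.normal(0.326, 0.129))
    t['led_on'] = t['stim_off'] + rng.uniform(0.700, 0.800)   # post-stimulus delay
    t['trial_end'] = t['led_on'] + rng.uniform(0.500, 1.000)  # re-fixation period
    return t
```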

Recording strategy

To minimize selection bias, the activity of any well-isolated neuron was recorded. The monkeys first participated in a block of trials of the visual-saccade task. LIP activity during this task was correlated with the location of the peripheral LED to construct a spatial response field. The visual-stimulus location that elicited the highest firing rate during the period in which the peripheral LED was illuminated was designated as the “IN” location. The location diametrically opposite (180° away) was the “OUT” location. If a LIP neuron was not modulated during the visual-saccade task, we operationally defined the IN and OUT locations as the speaker locations 12° to the right and left, respectively, of the central LED. Since LIP visual activity predicts the presence of auditory responses (Mazzoni et al. 1996; Linden et al. 1999; Gifford III and Cohen 2004), defining the IN and OUT locations through the visual-saccade task did not bias us against finding auditory LIP neurons. Next, the monkeys participated in a block of trials of the gap-fixation task. The two locations of the auditory stimuli (IN or OUT) and the two auditory-stimulus classes (band-pass noise or SSVs) were varied randomly on a trial-by-trial basis.
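The IN/OUT assignment reduces to an argmax over per-location mean rates; a minimal sketch (trial-data layout and variable names are hypothetical):

```python
import numpy as np

def define_in_out(rates, led_idx, n_locations=8):
    """Assign IN/OUT locations from a visual-saccade block. rates[i] is
    the firing rate on trial i while the peripheral LED was illuminated;
    led_idx[i] (0..7) is that trial's LED location on the 12-degree circle."""
    rates, led_idx = np.asarray(rates), np.asarray(led_idx)
    mean_rate = np.array([rates[led_idx == k].mean() for k in range(n_locations)])
    in_loc = int(np.argmax(mean_rate))                   # highest mean rate
    out_loc = (in_loc + n_locations // 2) % n_locations  # diametrically opposite
    return in_loc, out_loc
```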

Data analysis

Neural activity during the gap-fixation task was analyzed during the “stimulus” period, which began at auditory-stimulus onset and ended at its offset; the times of occurrence of action potentials were aligned relative to auditory-stimulus onset. Data were analyzed in terms of a neuron’s firing rate (the number of action potentials divided by task-period duration).
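In code, this firing-rate measure reduces to a windowed spike count; a minimal sketch (spike-time units and alignment conventions are assumptions):

```python
import numpy as np

def stimulus_period_rate(spike_times, stim_on, stim_off):
    """Stimulus-period firing rate: spikes between auditory-stimulus onset
    and offset divided by the period's duration. Spike times are assumed
    to be in seconds, on the same clock as the stimulus events."""
    spikes = np.asarray(spike_times)
    n_spikes = np.sum((spikes >= stim_on) & (spikes < stim_off))
    return n_spikes / (stim_off - stim_on)
```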

Two sets of analyses examined how auditory-stimulus location and class modulated LIP firing rates. In the first analysis, on a neuron-by-neuron basis, a two-factor (auditory-stimulus location × auditory-stimulus class) ANOVA tested whether the stimulus-period firing rate was modulated during the gap-fixation task. In the second analysis, we used an information analysis (Cover and Thomas 1991), similar to one that we have described previously (Cohen et al. 2002; Gifford III and Cohen 2004), to quantify the amount of “auditory-location” information and “auditory-class” information carried in the stimulus-period firing rate. Auditory-location information was the amount of information regarding differences in auditory-stimulus location, independent of class. Auditory-class information was the amount of information regarding differences in auditory-stimulus class, independent of location. Bit rates are reported in terms of “relative” information, which takes into account the amount of information carried in the firing rate due to chance (Panzeri and Treves 1996).
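A minimal sketch of both analyses in Python, using statsmodels for the two-factor ANOVA and a plug-in mutual-information estimate. The quantile binning of firing rates and the first-order form of the bias term are assumptions; the paper's actual estimator follows Panzeri and Treves (1996).

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.formula.api import ols

def two_factor_anova(df):
    """Neuron-by-neuron two-factor ANOVA. `df` is a pandas DataFrame of
    trials with columns 'rate' (stimulus-period firing rate),
    'location' ('IN'/'OUT'), and 'stim_class' ('SSV'/'noise')."""
    model = ols('rate ~ C(location) * C(stim_class)', data=df).fit()
    return sm.stats.anova_lm(model, typ=2)

def relative_information(responses, stimuli, n_bins=4):
    """Plug-in estimate of the mutual information (bits) between a
    quantile-binned firing rate and a stimulus label, minus a first-order
    bias estimate, (R-1)(S-1) / (2 N ln 2)."""
    responses, stimuli = np.asarray(responses), np.asarray(stimuli)
    edges = np.quantile(responses, np.linspace(0, 1, n_bins + 1)[1:-1])
    r = np.digitize(responses, edges)              # discretized responses
    s_vals, r_vals = np.unique(stimuli), np.unique(r)
    joint = np.array([[np.mean((stimuli == s) & (r == b)) for b in r_vals]
                      for s in s_vals])            # P(s, r)
    ps = joint.sum(axis=1, keepdims=True)          # P(s)
    pr = joint.sum(axis=0, keepdims=True)          # P(r)
    nz = joint > 0
    info = np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz]))
    bias = (len(r_vals) - 1) * (len(s_vals) - 1) / (2 * len(responses) * np.log(2))
    return info - bias
```

Auditory-location information would then be estimated with `stimuli` set to each trial's IN/OUT label, and auditory-class information with the SSV/noise label.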

Results

We recorded from 70 LIP neurons from the left hemispheres of two monkeys during the visual-saccade and gap-fixation tasks. LIP neurons were modulated by the spatial or the non-spatial attributes of an auditory stimulus. Three exemplar neurons are shown in Fig. 1. The neuron in Fig. 1A was sensitive to the non-spatial attributes (Friedman’s test of neural activity during the stimulus period, χ2=4.01, P<0.05) of an auditory stimulus but not its spatial attributes (χ2=2.46, P>0.05). This neuron was modulated by SSVs at both the IN and OUT locations. In contrast, while having a weak firing rate, the neuron in Fig. 1B was sensitive to the spatial attributes (χ2=7.82, P<0.05) of an auditory stimulus but not its non-spatial attributes (χ2=0.63, P>0.05). This neuron was modulated by SSVs and band-pass noise, but only at the IN location. Finally, Fig. 1C illustrates a neuron that was modulated by the combination of spatial (χ2=6.59, P<0.05) and non-spatial (χ2=7.29, P<0.05) attributes: this neuron was modulated by SSVs at the IN location.
Fig. 1A–C

Response fields of LIP neurons that are modulated by A auditory-stimulus class, B auditory-stimulus location, and C auditory-stimulus class and location. In each panel, the upper and lower panels illustrate data collected in response to band-pass noise and SSVs, respectively. The plots on the left are data collected when the auditory stimulus was at the IN location and those on the right from the OUT location. The rasters and spike-density histograms are aligned relative to auditory-stimulus onset. The data in A were adapted from Gifford III and Cohen (2004)

Population analyses confirmed that auditory-stimulus location and class modulated LIP activity. First, a significant (binomial probability, P<0.05) proportion (n=9/70) of LIP neurons was modulated (ANOVA, P<0.05) by auditory-stimulus location, whereas another significant (binomial probability, P<0.05) proportion (n=15/70) of LIP neurons was modulated (ANOVA, P<0.05) by auditory-stimulus class. One neuron (n=1/70) was modulated by both auditory-stimulus location and class, a proportion that was not different from that expected by chance (binomial probability, P>0.05). Second, the mean amounts of auditory-location information (0.02±0.05 bits; Fig. 2A) and auditory-class information (0.014±0.04 bits; Fig. 2B) were significantly greater than zero bits (one-tailed t-test, P<0.05: T(69)=3.0 and T(69)=3.7, respectively). Finally, we examined whether the degree of spatial modulation depended on auditory-stimulus class. We found that the mean amount of information regarding the location of SSVs was not reliably (two-tailed t-test, P>0.05) different from the mean amount of information regarding the location of band-pass noise.
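For reference, the binomial comparisons against the 5% false-positive rate can be reproduced with SciPy. With α=0.05, chance alone would flag about 70 × 0.05 = 3.5 neurons; the test asks whether the observed counts exceed that rate (a sketch; the one-sided alternative is an assumption):

```python
from scipy.stats import binomtest

# Observed counts of significant neurons out of 70; under the null,
# each neuron is "significant" with probability alpha = 0.05.
for label, k in [('location', 9), ('class', 15), ('both', 1)]:
    result = binomtest(k, n=70, p=0.05, alternative='greater')
    print(f'{label}: {k}/70 neurons, binomial P = {result.pvalue:.3f}')
```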
Fig. 2A–B

Information analyses. The distributions of A auditory-location information and B auditory-class information for 70 LIP neurons

Discussion

Lateral intraparietal neurons were modulated by the spatial and non-spatial attributes of an auditory stimulus. Our results support and extend previous investigations into the role of non-spatial processing in area LIP. Similar to previous visual studies of non-spatial processing (Sereno and Maunsell 1998; Toth and Assad 2002), our data suggest that LIP neurons code the non-spatial attributes of auditory stimuli. Also, this study, like that of Sereno and Maunsell (1998), indicates that non-spatial coding may be a fundamental property of area LIP and not the result of operant training in which monkeys learn to associate non-spatial features of a stimulus with a behavioral task to receive a reward.

How do these results fit into our conceptualization of cortical sensory processing? Classically, the spatial and non-spatial attributes of visual and auditory stimuli are thought to be processed in parallel channels (Ungerleider and Mishkin 1982; Rauschecker and Tian 2000). Our data and other studies suggest that these channels are not strictly parallel but, instead, may be interconnected (Ferrera et al. 1992; Sereno and Maunsell 1998; Middlebrooks 2002; Recanzone 2002; Toth and Assad 2002). While the utility of this non-spatial information in area LIP is unknown, we hypothesize that the mix of spatial and non-spatial processing may benefit the computations underlying spatial processing in area LIP when the non-spatial attributes provide behaviorally relevant information about a stimulus (Ferrera et al. 1994; Kusunoki et al. 2000; Toth and Assad 2002).

Acknowledgements

The authors acknowledge A. Underhill for help with animal care and training. The comments of J. Groh, B. Russ, K. MacLean, and H. Hersh were appreciated. M. Hauser provided the recordings of the rhesus vocalizations. YEC was supported by grants from the Whitehall Foundation and NIH, the Class of 1962 Faculty Fellowship, and a Burke Award.

Copyright information

© Springer-Verlag 2005