Journal of Computational Neuroscience (2006) 21:5

Sound representation methods for spectro-temporal receptive field estimation

Authors

  • Patrick Gill
    • Biophysics Group, University of California at Berkeley
  • Junli Zhang
    • Department of Psychology and Neurosciences Institute, University of California at Berkeley
  • Sarah M. N. Woolley
    • Department of Psychology and Neurosciences Institute, University of California at Berkeley
  • Thane Fremouw
    • Department of Psychology and Neurosciences Institute, University of California at Berkeley
  • Frédéric E. Theunissen
    • Biophysics Group, University of California at Berkeley
    • Department of Psychology and Neurosciences Institute, University of California at Berkeley
DOI: 10.1007/s10827-006-7059-4

Cite this article as:
Gill, P., Zhang, J., Woolley, S.M.N. et al. J Comput Neurosci (2006) 21: 5. doi:10.1007/s10827-006-7059-4

Abstract

The spectro-temporal receptive field (STRF) of an auditory neuron describes the linear relationship between the sound stimulus in a time-frequency representation and the neural response. Time-frequency representations of a sound in turn require a nonlinear operation on the sound pressure waveform, and many different forms of this nonlinear transformation are possible. Here, we systematically investigated the effects of four factors in the nonlinear step of the STRF model: the choice of logarithmic or linear filter frequency spacing, the time-frequency scale, stimulus amplitude compression, and adaptive gain control. We quantified the goodness of fit of these different STRF models on data obtained from auditory neurons in the songbird midbrain and forebrain. We found that adaptive gain control and the correct stimulus amplitude compression scheme are paramount to correctly modelling neurons. The time-frequency scale and frequency spacing also affected the goodness of fit of the model, but to a lesser extent, and the optimal values were stimulus dependent.
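For readers who want a concrete picture of the two-stage model the abstract describes, the sketch below computes a spectrogram-style time-frequency representation with a selectable amplitude compression and then fits a linear STRF to a response by ridge-regularized regression. It is a minimal illustration, not the authors' code: the function names, the ridge penalty, the lag count, and the use of ridge regression in place of the normalized reverse-correlation estimators common in the STRF literature are assumptions made here for clarity, and adaptive gain control and logarithmic frequency spacing are omitted.

```python
# Minimal sketch (not the authors' code) of the two-stage STRF model described
# in the abstract: a nonlinear time-frequency preprocessing step with a
# selectable amplitude compression, followed by a linear fit from that
# representation to the neural response. The ridge penalty `lam`, the lag
# count, and the use of ridge regression itself are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram


def sound_to_tf(waveform, fs, nperseg=256, compression="log"):
    """Time-frequency representation with a chosen amplitude nonlinearity."""
    freqs, times, power = spectrogram(waveform, fs=fs, nperseg=nperseg,
                                      noverlap=nperseg // 2)
    if compression == "log":       # logarithmic amplitude compression
        rep = np.log(power + 1e-10)
    elif compression == "root":    # cube-root compression, another common choice
        rep = np.cbrt(power)
    else:                          # no compression: raw power
        rep = power
    return freqs, times, rep       # rep has shape (n_freq, n_time)


def fit_strf(stim_tf, response, n_lags=20, lam=1.0):
    """Ridge-regularized linear fit of the response on lagged stimulus frames.

    stim_tf:  (n_freq, n_time) time-frequency stimulus representation
    response: (n_time,) spike rate aligned to the stimulus frames
    Returns an (n_freq, n_lags) STRF estimate.
    """
    n_freq, n_time = stim_tf.shape
    X = np.zeros((n_time, n_freq * n_lags))
    for lag in range(n_lags):
        shifted = np.roll(stim_tf, lag, axis=1)  # stimulus `lag` frames in the past
        shifted[:, :lag] = 0.0                   # zero out wrapped-around samples
        X[:, lag * n_freq:(lag + 1) * n_freq] = shifted.T
    # Closed-form ridge solution: (X'X + lam*I)^-1 X'y
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ response)
    return w.reshape(n_lags, n_freq).T


if __name__ == "__main__":
    # Toy demonstration with noise in place of song stimuli and a fake response.
    rng = np.random.default_rng(0)
    fs = 32000
    waveform = rng.standard_normal(fs)    # 1 s stand-in stimulus
    _, _, rep = sound_to_tf(waveform, fs)
    fake_rate = rng.random(rep.shape[1])  # fake spike rate, one value per frame
    strf = fit_strf(rep, fake_rate)
    print(strf.shape)                     # (n_freq, n_lags)
```

In this sketch, changing the `compression` argument corresponds to the paper's amplitude-compression factor; the other factors the abstract lists (frequency spacing, time-frequency scale, adaptive gain control) would be varied in the preprocessing step before the linear fit.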

Keywords

Receptive field · Zebra finch · STRF · Reverse correlation · Auditory cortex

Copyright information

© Springer Science + Business Media, LLC 2006