Guest editorial: analysis of neural data

  • Robert E. Kass

This special issue is an outgrowth of the fourth international symposium on Statistical Analysis of Neural Data (SAND4). The first such symposium was conceived a decade ago as a response to a striking anomaly: although quantitative methods had had a profound influence on neuroscience, there had been remarkably little interplay between neuroscience and contemporary statistical practice. Major advances in recording and imaging technologies had put into the hands of investigators wonderful tools for conducting previously unimagined experiments, yielding rich neuroscientific data sets. These data sets were, however, often large and complex, so that novel methods of analysis would be needed if the wealth of new information were to be turned into useful knowledge. Furthermore, while there had been explosive growth in statistical and machine learning techniques, it was apparent that the most effective approaches often had to be tailored to particular application areas, and very little of this had taken place in neuroscience. There was thus an urgent need for specific methodological developments in the context of particular neuroscientific applications. The SAND symposia therefore aimed to encourage discussion and dissemination of methodology for the analysis of neural data. Following the fourth symposium in 2008, a call for papers was posted publicly, and this special issue is the end result.

Since the time of the first SAND meeting in 2002, improved methods have been absorbed by the computational neuroscience community remarkably quickly. Originally, the primary goal of SAND was to define data-analytical problems at the cutting edge of neuroscientific research. At the present juncture, basic strategies have been articulated and developed, and it is fair to say that analysis of neural data has matured from a disparate collection of often-murky problems into a well-defined domain for research. What role the SAND meetings may have played in this evolution is hard to say, but it is clear that there have been important advances. The papers in this special issue represent current research directions and results in the analysis of neural data. One caveat is that neuroimaging is under-represented here, presumably partly because computational work can be published in some of the neuroimaging journals, and partly because neuroimaging is relatively removed from the traditional concerns of computational neuroscience. This special issue focuses mainly on analysis of data recorded from single and multiple electrodes, especially involving spike trains. Of the 23 papers appearing here, only one deals with a subject not involving electrophysiology: in their paper “Rapid determination of particle velocity from space-time images using the Radon transform,” Drew, Blinder, Cauwenberghs, Shih, and Kleinfeld discuss improved methods for laser scanning microscopy. Two papers are concerned with analysis of EEG data: Bhattacharya and Pereda, “An index of signal mode complexity based on orthogonal transformation,” and He and Thomson, “Canonical bicoherence analysis of dynamic EEG data.” In “Time series analysis of hybrid neurophysiological data and application of mutual information,” Biswas and Guha take up the problem of relating a spike train to a continuous signal such as EMG.

Two papers focus on analysis of local field potentials: Saleem, Chadderton, Apergis-Schoute, Harris, and Schultz, “Methods for predicting cortical UP and DOWN states from the phase of deep layer local field potentials,” and Stamoulis and Richardson, “Application of matched filtering to identify behavioral modulation of brain oscillations.”

Three of the papers concern state space models for neural data: Koyama, Chase, Whitford, Velliste, Schwartz, and Kass, “Comparison of brain-computer interface decoding algorithms in open-loop and closed-loop control,” Koyama and Paninski, “Efficient computation of the maximum a posteriori path and parameter estimation in integrate-and-fire and more general state-space models,” and Paninski, Ahmadian, Ferreira, Koyama, Rad, Vidne, Vogelstein, and Wu, “A new look at state-space models for neural data.”

The remainder of the papers are about various aspects of single and multi-electrode spike train analysis. Spike sorting is discussed by Franke, Natora, Boucsein, Munk, and Obermayer in their paper “An online spike detection and spike classification algorithm capable of instantaneous resolution of overlapping spikes,” and four articles present methods for individual spike trains: Endres and Oram, “Feature extraction from spike trains with Bayesian binning: ‘Latency is where the signal starts’,” Shimazaki and Shinomoto, “Kernel bandwidth optimization in spike rate estimation,” Shimokawa, Koyama and Shinomoto, “A characterization of the time-rescaled gamma process as a model for spike trains,” and Trevino, Coleman, and Allen, “A dynamical point process model of auditory nerve spiking in response to complex sounds.” Characterization of bursts is discussed for single spike trains by Tokdar, Xi, Kelly, and Kass, “Detection of bursts in extracellular spike trains using hidden semi-Markov point process models,” and for multiple spike trains by Pasquale, Martinoia, and Chiappalone, “A self-adapting approach for the detection of bursts and network bursts in neuronal cultures.” Several important problems in multiple spike train analysis are attacked in six papers: Echtermeyer, Smulders, and Smith, “Causal pattern recovery from neural spike train data using the Snap Shot Score,” Gourévitch and Eggermont, “Maximum decoding abilities of temporal patterns and synchronized firings: applications to auditory neurons responding to click trains and amplitude modulated white noise,” Gürel, Rotter, and Egert, “Functional identification of biological neural networks using reservoir adaptation for point processes,” Krumin, Shimron, and Shoham, “Correlation-distortion based identification of Linear-Nonlinear-Poisson models,” Peyrache, Benchenane, Khamassi, Wiener, and Battaglia, “Principal component analysis of ensemble recordings reveals cell assemblies at high temporal resolution,” and Staude, Rotter, and Grün, “CuBIC: cumulant based inference of higher-order correlations in massively parallel spike trains.”

Many of these papers resulted from work presented and discussed at SAND4 and I therefore gratefully acknowledge the support of NIMH and NSF, without which the conference would not have been possible.

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  1. Department of Statistics, Machine Learning Department, and Center for the Neural Basis of Cognition, Carnegie Mellon University, Pittsburgh, USA