Date: 11 Jun 2011
The Ising decoder: reading out the activity of large neural ensembles
The Ising model has recently received much attention for the statistical description of neural spike train data. In this paper, we propose and demonstrate its use for building decoders capable of predicting, on a millisecond timescale, the stimulus represented by a pattern of neural activity. After fitting to a training dataset, the Ising decoder can be applied “online” for instantaneous decoding of test data. While such models can be fit exactly using Boltzmann learning, this approach rapidly becomes computationally intractable as neural ensemble size increases. We show that several approaches, including the Thouless–Anderson–Palmer (TAP) mean field approach from statistical physics, and the recently developed Minimum Probability Flow Learning (MPFL) algorithm, can be used for rapid inference of model parameters in large-scale neural ensembles. Use of the Ising model for decoding, unlike other problems such as functional connectivity estimation, requires estimation of the partition function. As this involves summation over all possible responses, this step can be limiting. Mean field approaches avoid this problem by providing an analytical expression for the partition function. We demonstrate these decoding techniques by applying them to simulated neural ensemble responses from a mouse visual cortex model, finding an improvement in decoder performance for a model with heterogeneous as opposed to homogeneous neural tuning and response properties. Our results demonstrate the practicality of using the Ising model to read out, or decode, spatial patterns of activity comprised of many hundreds of neurons.
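The decoding scheme outlined above can be made concrete: for each stimulus, fit a pairwise model with fields h and couplings J, compute its log partition function, and report the stimulus whose fitted model assigns the highest probability to the observed response pattern. The following is a minimal sketch (Python; the function names are illustrative, not from the paper) that uses the exact sum over all 2^N patterns for the partition function. That exponential-cost sum is precisely the step which the TAP mean-field approximation replaces with an analytical expression when the ensemble is large.

```python
import itertools
import numpy as np

def log_partition(h, J):
    """Exact log partition function, summing over all 2^N binary patterns.
    Tractable only for small N; for large ensembles this sum is the
    bottleneck that mean-field (e.g. TAP) approximations avoid."""
    N = len(h)
    Z = 0.0
    for bits in itertools.product([0, 1], repeat=N):
        r = np.array(bits)
        Z += np.exp(h @ r + 0.5 * r @ J @ r)
    return np.log(Z)

def log_likelihood(r, h, J):
    """Log probability of binary response pattern r under a pairwise model."""
    return h @ r + 0.5 * r @ J @ r - log_partition(h, J)

def decode(r, models):
    """Maximum-likelihood decoding: return the stimulus label whose fitted
    (h, J) model assigns the highest probability to the observed pattern r."""
    return max(models, key=lambda s: log_likelihood(r, *models[s]))
```

For instance, with two hypothetical stimuli whose fitted fields favour opposite neurons, `decode(np.array([1, 0]), models)` returns the stimulus whose field drives the first neuron. Because each stimulus shares the same response space, decoding only needs the per-stimulus log likelihoods, which is why an accurate (or analytically approximated) partition function per model is unavoidable here, unlike in functional connectivity estimation.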
Action Editor: Jonathan David Victor
Bair, W., Zohary, E., & Newsome, W. T. (2001). Correlated firing in macaque visual area MT: Time scales and relationship to behavior. Journal of Neuroscience, 21(5), 1676–1697.
Berger, A. L., Pietra, V. J. D., & Pietra, S. A. D. (1996). A maximum entropy approach to natural language processing. Computational Linguistics, 22(1), 39–71.
Bishop, C. M. (2007). Pattern recognition and machine learning (Information science and statistics) (corrected 2nd printing). New York: Springer.
Broderick, T., Dudík, M., Tkačik, G., Schapire, R. E., & Bialek, W. (2007). Faster solutions of the inverse pairwise Ising problem. arXiv:0712.2437v2.
Das, A., & Gilbert, C. D. (1999). Topography of contextual modulations mediated by short-range interactions in primary visual cortex. Nature, 399, 643–644.
Földiák, P. (1993). The ‘ideal homunculus’: Statistical inference from neural population responses. In Computation and neural systems (pp. 55–60). Norwell: Kluwer Academic.
Hertz, J., Roudi, Y., Thorning, A., Tyrcha, J., Aurell, E., & Zeng, H. L. (2010). Inferring network connectivity using kinetic Ising models. BMC Neuroscience, 11(Suppl 1), P51.
Higham, N. J. (2002). Computing the nearest correlation matrix—A problem from finance. IMA Journal of Numerical Analysis, 22(3), 329.
Huang, F., & Ogata, Y. (2001). Comparison of two methods for calculating the partition functions of various spatial statistical models. Australian & New Zealand Journal of Statistics, 43(1), 47–65.
Jaynes, E. T. (1957). Information theory and statistical mechanics. Physical Review, 106(4), 620–630.
Kappen, H. J., & Rodríguez, F. B. (1998). Efficient learning in Boltzmann machines using linear response theory. Neural Computation, 10(5), 1137–1156.
Mak, J. N., & Wolpaw, J. R. (2009). Clinical applications of brain–computer interfaces: Current state and future prospects. Reviews on Biomedical Engineering, 2, 187–199.
Ogata, Y., & Tanemura, M. (1984). Likelihood analysis of spatial point patterns. Journal of the Royal Statistical Society. Series B, 46(3), 496–518.
Plefka, T. (2006). Expansion of the Gibbs potential for quantum many-body systems: General formalism with applications to the spin glass and the weakly nonideal Bose gas. Physical Review E, 73(1), 016129.
Pola, G., Thiele, A., Hoffmann, K., & Panzeri, S. (2003). An exact method to quantify the information transmitted by different mechanisms of correlational coding. Network: Computation in Neural Systems, 14(1), 35–60.
Roudi, Y., Aurell, E., & Hertz, J. A. (2009a). Statistical physics of pairwise probability models. Frontiers in Computational Neuroscience, 3:22, 1–15.
Roudi, Y., & Hertz, J. (2011). Mean field theory for non-equilibrium network reconstruction. Physical Review Letters, 106, 048702.
Roudi, Y., Nirenberg, S., & Latham, P. E. (2009b). Pairwise maximum entropy models for studying large biological systems: When they can work and when they can’t. PLoS Computational Biology, 5(5), e1000380.
Roudi, Y., Tyrcha, J., & Hertz, J. (2009c). Ising model for neural data: Model quality and approximate methods for extracting functional connectivity. Physical Review E, 79(5), 051915.
Seiler, H., Zhang, Y., Saleem, A., Bream, P., Apergis-Schoute, J., & Schultz, S. R. (2009). Maximum entropy decoding of multivariate neural spike trains. BMC Neuroscience, 10(Suppl 1), P107.
Sohl-Dickstein, J., Battaglino, P., & DeWeese, M. R. (2009). Minimum probability flow learning. arXiv:0906.4779v2.
Tanaka, T. (1998). Mean-field theory of Boltzmann machine learning. Physical Review E, 58(2), 2302–2310.
Thouless, D. J., Anderson, P. W., & Palmer, R. G. (1977). Solution of ‘solvable model of a spin glass’. Philosophical Magazine, 35(3), 593–601.
Journal of Computational Neuroscience, Volume 32, Issue 1, pp. 101–118
- Publisher: Springer US
Keywords: Neural coding · Decoding algorithm · Brain–machine interface · Brain–computer interface · Visual cortex · Spatial patterns