Encyclopedia of Computational Neuroscience

2015 Edition
Editors: Dieter Jaeger, Ranu Jung

Estimating Information-Theoretic Quantities

  • Robin A. A. Ince
  • Simon R. Schultz
  • Stefano Panzeri
Reference work entry
DOI: https://doi.org/10.1007/978-1-4614-6675-8_140

Definition

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only to linear effects; it has meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience (see entry “Summary of Information-Theoretic Quantities”). Estimating these quantities accurately and without bias from real neurophysiological data frequently presents challenges, which are explained in this entry.
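
To make the estimation problem concrete, the sketch below (a minimal illustration, not part of the original entry) computes the mutual information I(S;R) between a discrete stimulus and a discretized single-trial neural response using the plug-in (maximum-likelihood) estimator, optionally adding the first-order Miller-Madow correction for the limited-sampling bias. The function names and the simulated Poisson spike counts are illustrative assumptions; practical analyses typically rely on more sophisticated bias-correction procedures and dedicated toolboxes.

```python
import numpy as np

def entropy_plugin(counts):
    """Plug-in (maximum-likelihood) entropy estimate, in bits, from a vector of bin counts."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log2(p))

def miller_madow_correction(counts):
    """First-order bias correction (Miller-Madow): (K - 1) / (2 N ln 2) bits,
    where K is the number of occupied bins and N the number of samples."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    k = np.count_nonzero(counts)
    return (k - 1) / (2.0 * n * np.log(2))

def mutual_information(stimuli, responses, correct_bias=True):
    """Estimate I(S;R) = H(R) - H(R|S) from paired per-trial arrays of
    discrete stimulus labels and discrete (e.g., binned) responses."""
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    resp_values = np.unique(responses)
    n_trials = len(stimuli)

    # Marginal response entropy H(R)
    counts_r = np.array([np.sum(responses == r) for r in resp_values])
    h_r = entropy_plugin(counts_r)
    if correct_bias:
        h_r += miller_madow_correction(counts_r)

    # Stimulus-conditional ("noise") entropy H(R|S), weighted by stimulus frequency
    h_r_given_s = 0.0
    for s in np.unique(stimuli):
        mask = stimuli == s
        counts_rs = np.array([np.sum(responses[mask] == r) for r in resp_values])
        h_s = entropy_plugin(counts_rs)
        if correct_bias:
            h_s += miller_madow_correction(counts_rs)
        h_r_given_s += (mask.sum() / n_trials) * h_s

    return h_r - h_r_given_s

# Toy example: two stimuli, single-trial spike counts with stimulus-dependent rates
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, size=200)
resp = rng.poisson(lam=np.where(stim == 0, 2.0, 5.0))
print(f"I(S;R) estimate: {mutual_information(stim, resp):.3f} bits")
```

Because H(R|S) is estimated from fewer trials per stimulus than H(R), the raw plug-in estimate of I(S;R) is biased upward; the correction only partially compensates for this, which is why dedicated bias-correction methods are a central topic of this entry.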

Detailed Description

Information-Theor...

Acknowledgments

Research is supported by the SI-CODE (FET-Open, FP7-284533) project and by the ABC and NETT (People Programme Marie Curie Actions PITN-GA-2011-290011 and PITN-GA-2011-289146) projects of the European Union’s Seventh Framework Programme FP7 2007–2013.


Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Robin A. A. Ince (1)
  • Simon R. Schultz (2)
  • Stefano Panzeri (3, 4)

  1. School of Psychology, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
  2. Department of Bioengineering, Imperial College London, London, UK
  3. Center for Neuroscience and Cognitive Systems, Istituto Italiano di Tecnologia, Rovereto (TN), Italy
  4. Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK