Encyclopedia of Computational Neuroscience

Living Edition
| Editors: Dieter Jaeger, Ranu Jung

Summary of Information Theoretic Quantities

  • Robin A. A. Ince
  • Stefano Panzeri
  • Simon R. Schultz
Living reference work entry
DOI: https://doi.org/10.1007/978-1-4614-7320-6_306-1


Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited to studying information flow in the nervous system. As a framework, it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units which, in many cases, allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials rather than in averages over multiple trials. A variety of information theoretic quantities are in common use in neuroscience, including the Shannon entropy, the Kullback–Leibler divergence, and the mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in...
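The three quantities named above are simple to compute once the relevant probability distributions are known. As a minimal sketch (not the entry's own notation, and assuming discrete variables with fully specified distributions rather than distributions estimated from limited experimental data), they can be written as:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention 0 log 0 = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_x p(x) log2(p(x)/q(x))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(pxy):
    """Mutual information I(X;Y) from a joint probability table,
    computed as the KL divergence between the joint distribution
    and the product of its marginals."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)  # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal p(y)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask]))

# A fair coin carries 1 bit of entropy:
print(entropy([0.5, 0.5]))  # -> 1.0

# Two perfectly correlated binary variables share 1 bit:
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # -> 1.0
```

Note that applying these formulas directly to empirical frequency estimates produces biased values; the estimation issue mentioned at the end of the abstract is precisely why dedicated estimators exist.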





Acknowledgments

Research supported by the SI-CODE (FET-Open, FP7-284533) project and by the ABC and NETT (People Programme Marie Curie Actions PITN-GA-2011-290011 and PITN-GA-2011-289146) projects of the European Union's Seventh Framework Programme FP7 2007–2013.



Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Robin A. A. Ince (1)
  • Stefano Panzeri (1, 2)
  • Simon R. Schultz (3)

  1. School of Psychology, Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
  2. Center for Neuroscience and Cognitive Systems, Italian Institute of Technology, Rovereto, Italy
  3. Department of Bioengineering, Imperial College London, London, UK