Living Reference Work Entry

Encyclopedia of Computational Neuroscience

Summary of Information Theoretic Quantities

  • Robin A. A. Ince, School of Psychology, Institute of Neuroscience and Psychology, University of Glasgow
  • Stefano Panzeri, School of Psychology, Institute of Neuroscience and Psychology, University of Glasgow; Center for Neuroscience and Cognitive Systems, Italian Institute of Technology
  • Simon R. Schultz, Department of Bioengineering, Imperial College London

Definition

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework, it has a number of useful properties: it provides a general measure that is sensitive to any relationship between variables, not only linear effects; its quantities have meaningful units which, in many cases, allow a direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials rather than in averages over multiple trials. A variety of information theoretic quantities are in common use in neuroscience, including the Shannon entropy, the Kullback–Leibler divergence, and the mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in practice …
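For discrete variables, the standard definitions of the three quantities named above can be sketched directly in code. The following minimal example is not drawn from the entry itself: the function names are our own, base-2 logarithms are used so that all quantities are measured in bits, and probability distributions are assumed to be given as NumPy arrays that sum to one.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # 0 log 0 is taken as 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) = sum_x p(x) log2(p(x)/q(x))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                          # terms with p(x) = 0 contribute 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(pxy):
    """Mutual information I(X;Y) from a joint probability table p(x, y).

    I(X;Y) is the KL divergence between the joint distribution and the
    product of its marginals: D( p(x,y) || p(x) p(y) ).
    """
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    return kl_divergence(pxy.ravel(), (px * py).ravel())

# Example: a binary channel that flips its input 10% of the time.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(entropy(pxy.sum(axis=1)))       # H(X) = 1 bit (uniform input)
print(mutual_information(pxy))        # about 0.531 bits
```

In the example, the joint table describes a binary symmetric channel with a 10% flip probability; the mutual information works out to roughly 0.531 bits, consistent with 1 - H(0.1), where H is the binary entropy function.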
