Information Theory and Systems Neuroscience

  • Don H. Johnson
  • Ilan N. Goodman
  • Christopher J. Rozell
Part of the Springer Series in Computational Neuroscience (NEUROSCI, volume 7)


Abstract

Information theory reveals the performance limits of communication and signal processing systems, the brain being a particularly interesting example. Applying this powerful theory to neural signals, however, presents many pitfalls. We discuss the problem areas, describe how to resolve the issues that arise, and summarize modern information-theoretic results pertinent to neuroscience.
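
As one concrete illustration of the estimation pitfalls (a minimal sketch of our own, not the chapter's method: the binary stimulus, stimulus-independent Poisson spike counts, and plug-in histogram estimator are all illustrative assumptions), the Python snippet below shows that mutual information estimated from limited spike data is biased upward: even a response that is statistically independent of the stimulus yields a positive estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_mi(stim, counts):
    """Plug-in (histogram) estimate of I(stim; counts) in bits."""
    n = len(stim)
    joint, ps, pc = {}, {}, {}
    for s, c in zip(stim, counts):
        joint[(s, c)] = joint.get((s, c), 0) + 1
        ps[s] = ps.get(s, 0) + 1
        pc[c] = pc.get(c, 0) + 1
    # I = sum over (s, c) of p(s,c) * log2[ p(s,c) / (p(s) p(c)) ]
    return sum(k / n * np.log2(k * n / (ps[s] * pc[c]))
               for (s, c), k in joint.items())

# Both stimulus classes drive identical Poisson rates, so the true
# mutual information between stimulus and spike count is exactly zero.
n = 200
stim = rng.integers(0, 2, size=n)     # binary stimulus label
counts = rng.poisson(10.0, size=n)    # spike counts, independent of stim
print(plugin_mi(stim, counts))        # small positive value: pure bias
```

The estimate comes out positive purely because the empirical joint histogram fluctuates around independence, and the bias grows as the number of trials shrinks; bias of this kind is one of the well-known traps in applying information theory to neural data.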


Keywords: Mutual information · Spike train · Rate code · Neural code · Distortion measure





Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Don H. Johnson (1)
  • Ilan N. Goodman (1)
  • Christopher J. Rozell (2)
  1. Department of Electrical and Computer Engineering, Rice University, Houston, USA
  2. School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, USA
