Abstract
Information theory establishes the performance limits of communication and signal-processing systems, the brain being a particularly interesting example. Applying this powerful theory to neural signals, however, entails many pitfalls. We discuss these problem areas, describe how the issues can be resolved, and summarize modern information-theoretic results pertinent to neuroscience.
Copyright information
© 2010 Springer Science+Business Media, LLC
Cite this chapter
Johnson, D.H., Goodman, I.N., Rozell, C.J. (2010). Information Theory and Systems Neuroscience. In: Grün, S., Rotter, S. (eds) Analysis of Parallel Spike Trains. Springer Series in Computational Neuroscience, vol 7. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-5675-0_13
Print ISBN: 978-1-4419-5674-3
Online ISBN: 978-1-4419-5675-0
eBook Packages: Biomedical and Life Sciences (R0)