Abstract
An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we draw on the concept of redundancy in the information transmitted by a group of neurons and introduce a novel measure of cooperation between pairs of neurons, the relative mutual information (RMI). Specifically, we studied these two quantities for spike trains generated by neighboring neurons in the primary visual cortex of the awake, freely moving rat. The spike trains studied here were generated spontaneously in the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the RMI oscillated slightly around an average value, the redundancy exhibited markedly higher variability. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more resistant to noise disturbances. Furthermore, the redundancy values suggest that neurons can cooperate flexibly during information transmission. This mostly occurs via a leading neuron with a higher transmission rate or, less frequently, with the information rate of the whole group exceeding the sum of the individual information rates, that is, synergistically. The proposed method applies not only to stationary but also to locally stationary neural signals.
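As a rough illustration of the kind of quantities the abstract compares, the sketch below estimates per-neuron and joint entropy rates of binned (binary) spike trains via Lempel–Ziv (1976) complexity and derives a simple redundancy index from them. All names here (`lz76_complexity`, `entropy_rate`, `redundancy`) and the crude normalization h ≈ c(n)·log2(n)/n are illustrative assumptions; the paper's actual RMI and redundancy definitions, and its refined entropy-rate estimators, are given in the full text.

```python
import math

def lz76_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) exhaustive parsing of
    the symbol string s (Kaspar-Schuster style algorithm)."""
    n = len(s)
    if n <= 1:
        return n
    i, k, l = 0, 1, 1       # i: match start, k: match length, l: phrase start
    c, k_max = 1, 1         # c: phrase count so far
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:   # current phrase runs off the end of s
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:      # no earlier match extends: close the phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def entropy_rate(s):
    """Crude LZ estimate of the entropy rate in bits per bin:
    h ~ c(n) * log2(n) / n. The paper's estimators are more refined."""
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n

def redundancy(train_a, train_b):
    """Hypothetical redundancy index for two binned spike trains given as
    '0'/'1' strings: 1 - h_joint / (h_a + h_b).
    Positive -> redundant, near 0 -> independent, negative -> synergistic
    (joint rate exceeding the sum of individual rates)."""
    # Encode each time bin of the pair as one symbol from {0, 1, 2, 3}.
    joint = "".join(str(int(a) + 2 * int(b)) for a, b in zip(train_a, train_b))
    return 1.0 - entropy_rate(joint) / (entropy_rate(train_a) + entropy_rate(train_b))
```

For two identical periodic trains such as `"01" * 32`, the joint sequence carries no more information than either train alone, and the index comes out at 0.5; real spike trains would of course give noisier, intermediate values.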
Acknowledgments
This research was the result of a collaboration supported by the Spanish-Polish Scientific Cooperation Program CSIC-PAS, grant 17/R04/R05. Experimental research was supported by the Spanish Ministry of Science and Innovation and the EU Future and Emerging Technologies Program (PRESENCCIA project) to MVSV.
Open Access
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Cite this article
Szczepanski, J., Arnold, M., Wajnryb, E. et al. Mutual information and redundancy in spontaneous communication between cortical neurons. Biol Cybern 104, 161–174 (2011). https://doi.org/10.1007/s00422-011-0425-y