Biological Cybernetics, Volume 104, Issue 3, pp 161–174

Mutual information and redundancy in spontaneous communication between cortical neurons

  • J. Szczepanski
  • M. Arnold
  • E. Wajnryb
  • J. M. Amigó
  • M. V. Sanchez-Vives
Open Access
Original Paper

Abstract

An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we resort to the concept of redundancy in the information transmitted by a group of neurons and, at the same time, we introduce a novel measure of cooperation between pairs of neurons called relative mutual information (RMI). Specifically, we studied these two parameters for spike trains generated by neighboring neurons in the primary visual cortex of the awake, freely moving rat. The spike trains studied here were generated spontaneously in the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the value of RMI oscillated slightly around an average value, the redundancy exhibited markedly higher variability. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more resistant to noise disturbances. Furthermore, the redundancy values suggest that neurons can cooperate flexibly during information transmission. This mostly occurs via a leading neuron with a higher transmission rate or, less frequently, through the information rate of the whole group exceeding the sum of the individual information rates—in other words, synergistically. The proposed method applies not only to stationary, but also to locally stationary, neural signals.
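The quantities discussed in the abstract can be illustrated on binned, binary spike trains. The sketch below uses simple plug-in (maximum-likelihood) entropy estimates; the particular normalisations chosen for redundancy and RMI are illustrative assumptions, not necessarily the estimators or definitions used in the paper:

```python
from collections import Counter
import numpy as np

def plugin_entropy(symbols):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from paired symbol sequences."""
    return plugin_entropy(x) + plugin_entropy(y) - plugin_entropy(list(zip(x, y)))

def relative_mutual_information(x, y):
    """One plausible normalisation: I(X;Y) divided by the joint entropy
    H(X,Y). This is a hypothetical form for illustration; the paper's
    exact RMI definition may differ."""
    return mutual_information(x, y) / plugin_entropy(list(zip(x, y)))

def redundancy(trains):
    """Group redundancy as 1 - H(joint) / sum_i H(X_i):
    0 = independent coding, values approaching 1 = fully redundant.
    (Synergy, as described in the abstract, corresponds to the group's
    information *rate* exceeding the summed individual rates; detecting
    it requires rate estimators rather than this simple plug-in form.)"""
    joint = list(zip(*trains))
    return 1.0 - plugin_entropy(joint) / sum(plugin_entropy(t) for t in trains)
```

For example, two identical trains give a redundancy of 0.5 under this normalisation (the joint entropy is half the summed individual entropies), while statistically independent trains give a redundancy near 0.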


Keywords: Neurons · Shannon information · Entropy · Mutual information · Redundancy · Visual cortex · Spike trains · Spontaneous activity



Copyright information

© The Author(s) 2011

Authors and Affiliations

  • J. Szczepanski (1, 5)
  • M. Arnold (2)
  • E. Wajnryb (1)
  • J. M. Amigó (3)
  • M. V. Sanchez-Vives (4)

  1. Institute of Fundamental Technological Research, Warsaw, Poland
  2. Instituto de Neurociencias de Alicante, Universidad Miguel Hernández-CSIC, San Juan de Alicante, Spain
  3. Centro de Investigación Operativa, Universidad Miguel Hernández, Elche, Spain
  4. ICREA-Institut d’Investigacions Biomediques August Pi y Sunyer (IDIBAPS), Barcelona, Spain
  5. Kazimierz Wielki University, Bydgoszcz, Poland
