Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures

Original Article

Abstract

The value of Shannon’s mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus- or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback–Leibler divergence is then the only suitable measure of specific information. On a more general level, we discuss the necessity and the fundamental aspects of coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
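
For concreteness, the two decompositions named above are conventionally written as the specific surprise i_SP(s) = D_KL(p(r|s) || p(r)) and the uncertainty reduction i_UR(s) = h(R) - h(R|s), where h denotes the (differential) entropy; both average to the mutual information I(S;R) over the stimulus prior. The snippet below is a minimal numerical sketch of the invariance issue, not the article's own code: a hypothetical two-stimulus model with Gaussian conditional response densities is re-expressed in a transformed response coordinate y = exp(r). The example densities, grids, and variable names are illustrative assumptions.

    import numpy as np

    # Minimal sketch (illustrative, not from the article): two stimuli with
    # Gaussian conditional response densities p(r|s); we test how two
    # stimulus-specific information measures behave under the invertible
    # response transformation y = exp(r).

    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

    def trapz(f, x):
        # plain trapezoidal rule; works on non-uniform grids
        return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

    def diff_entropy(p, x):
        # differential entropy in bits; zero-density points contribute nothing
        f = np.zeros_like(p)
        m = p > 1e-300
        f[m] = p[m] * np.log2(p[m])
        return -trapz(f, x)

    def kl_div(p, q, x):
        # Kullback-Leibler divergence D(p || q) in bits
        f = np.zeros_like(p)
        m = (p > 1e-300) & (q > 1e-300)
        f[m] = p[m] * np.log2(p[m] / q[m])
        return trapz(f, x)

    r = np.linspace(-12.0, 12.0, 200001)          # response grid
    p_s = np.array([0.5, 0.5])                    # stimulus prior
    cond = np.vstack([gauss(r, -1.0, 1.0),        # p(r | s1)
                      gauss(r,  2.0, 1.5)])       # p(r | s2)
    marg = p_s @ cond                             # marginal density p(r)

    # specific surprise (KL form) and uncertainty reduction, per stimulus
    i_sp = [kl_div(c, marg, r) for c in cond]
    i_ur = [diff_entropy(marg, r) - diff_entropy(c, r) for c in cond]

    # invertible coordinate change y = exp(r); densities pick up the Jacobian:
    # p_Y(y) = p_R(r) / |dy/dr| = p_R(r) * exp(-r)
    y = np.exp(r)
    cond_y = cond / y
    marg_y = marg / y

    i_sp_y = [kl_div(c, marg_y, y) for c in cond_y]
    i_ur_y = [diff_entropy(marg_y, y) - diff_entropy(c, y) for c in cond_y]

    print("specific surprise,     r vs y:", i_sp, i_sp_y)   # unchanged
    print("uncertainty reduction, r vs y:", i_ur, i_ur_y)   # changes
    print("weighted averages (= I(S;R)):", np.dot(p_s, i_sp), np.dot(p_s, i_ur))

The KL form is unchanged by construction, since the Jacobian factors cancel inside the logarithm, whereas under this particular map the uncertainty reduction shifts by (E[R] - E[R|s]) log2(e) bits, because h(exp(R)) = h(R) + E[R] log2(e). This is the behavior the abstract refers to: requiring invariance under such transformations singles out the Kullback–Leibler divergence among the candidate specific information measures.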

Keywords

Coordinate invariance · Mutual information · Specific information · Kullback–Leibler divergence

Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Institute of Physiology, Czech Academy of Sciences, Prague 4, Czech Republic
