Abstract
Can information theory be used to understand neural signaling? Yes, but assumptions have to be made about the nature of that signaling. The traditional view is that the individual neural spike is an all-or-none phenomenon, which allows neural spikes to be viewed as discrete, binary pulses, similar in kind to the signals in digital computers. Under this assumption, the tools of information theory can be used to derive results about the properties of neural signals. However, new results from neuroscience demonstrate that the precise shape of the individual spike can be functionally significant, thus violating the assumption that spikes can always be treated as binary pulses. Instead, spikes must sometimes be viewed as continuous signals. Fortunately, information-theoretic tools exist for the study of continuous signals; unfortunately, their use in the continuous domain is very different from their use in the discrete domain, and not always well understood. Researchers interested in making precise claims about the nature of the information used, stored, and processed in neural systems must pay careful attention to these differences.
Notes
What “objective” amounts to here will be spelled out further later; the objectivity of information becomes complicated when we move from discrete to continuous systems.
A number of textbooks on information theory provide the relevant details, such as Aczél and Daróczy (1975).
This is based on a similar concern about whether rocks implement any arbitrary finite automaton (Putnam 1988).
One of the first applications of information theory to neural firing is MacKay and McCulloch (1952). Surveys of these results and the specific techniques can be found in Rieke et al. (1997), chapters 1 and 4 of Dayan and Abbott (2005), Victor (2006), and Rolls and Treves (2011), and a special issue of the Journal of Computational Neuroscience (Dimitrov et al. 2011).
In English: as the number of symbols approaches infinity, so does the information entropy.
Of course, if the probabilities were not equal, there may be ways of creating the partitions that lower the average number of choices, which in turn means the information entropy would be lower.
These examples are analogous to perfectly balanced binary search trees in computer science.
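The analogy in this note can be spelled out with a short sketch (mine, not the article's): pinning down one of n equally likely symbols by repeated halving, as in a perfectly balanced binary search, takes a number of yes/no questions that matches the entropy of the uniform distribution whenever n is a power of two.

```python
import math

def questions_needed(n):
    """Number of yes/no questions to identify one of n equally likely
    items by repeatedly halving the remaining candidates, as a
    perfectly balanced binary search would."""
    count = 0
    while n > 1:
        n = math.ceil(n / 2)
        count += 1
    return count

def entropy_uniform(n):
    """Entropy, in bits, of a uniform distribution over n symbols."""
    return math.log2(n)

# 8 equally likely symbols: 3 halving questions, and log2(8) = 3 bits.
# For powers of two the two quantities coincide exactly.
```

For non-powers of two, `questions_needed` rounds up to a whole question while the entropy is fractional, which is why entropy is best read as an average over many symbols rather than a per-symbol question count.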
In my opinion, “data” is a mass noun, not a count noun, thus making sense of terms like “data point” and “data set.”
Similar results have been known to occur in invertebrates for decades. For example, Shapiro et al. (1980) demonstrate that the release of neurotransmitter is a function of the waveform of neural spikes in sea slugs.
A post-synaptic neuron is a neuron “downstream” from the neuron in question. When discussing neural firing, it is conventional to call the neuron generating a spike the pre-synaptic neuron, and the one that will be affected by this spike after it reaches the synapse the post-synaptic neuron.
This is, of course, how these systems are designed: the continuously-varying voltage is meant to realize logical elements that only take one of two values.
References
Aczél, J., and Z. Daróczy. 1975. On Measures of Information and Their Characterizations. New York: Academic Press.
Alle, H., and J.R.P. Geiger. 2006. Combined analog and action potential coding in hippocampal mossy fibers. Science 311: 1290–1293.
Baddeley, R. 2000. Introductory information theory and the brain. In Information Theory and the Brain, eds. R. Baddeley, P. Hancock, and P. Földiák. Cambridge: Cambridge University Press.
Baker, A. 2005. Are there genuine mathematical explanations of physical phenomena? Mind 114(454): 223–238.
Bialowas, A., S. Rama, M. Zbili, V. Marra, L. Fronzaroli Molinieres, N. Ankri, E. Carlier, and D. Debanne. 2015. Analog modulation of spike-evoked transmission in CA3 circuits is determined by axonal Kv1.1 channels in a time-dependent manner. European Journal of Neuroscience 41(3): 293–304.
Cao, R. 2014. Signaling in the brain: in search of functional units. Philosophy of Science 81(5): 891–901.
Chalmers, D.J. 1996. Does a rock implement every finite-state automaton? Synthese 108(3): 309–333.
Dayan, P., and L.F. Abbott. 2005. Theoretical Neuroscience. Cambridge: MIT Press.
DiCaprio, R.A. 2004. Information transfer rate of nonspiking afferent neurons in the crab. Journal of Neurophysiology 92: 302–310.
Dimitrov, A.G., A.A. Lazar, and J.D. Victor. 2011. Information theory in neuroscience. Journal of Computational Neuroscience 30(1): 1–5.
Dretske, F. 1981. Knowledge and the Flow of Information. Cambridge: MIT Press.
Gerstner, W., W.M. Kistler, R. Naud, and L. Paninski. 2014. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. Cambridge: Cambridge University Press.
Ihara, S. 1993. Information Theory for Continuous Systems. River Edge: World Scientific.
Jaynes, E.T. 1963. Information theory and statistical mechanics. In Statistical Physics, ed. K. Ford. New York: W. A. Benjamin.
MacKay, D.M., and W.S. McCulloch. 1952. The limiting information capacity of a neuronal link. The Bulletin of Mathematical Biophysics 14(2): 127–135.
Maley, C.J. 2018. Toward analog neural computation. Minds and Machines 28(1): 77–91.
Park, I.M., S. Seth, A.R.C. Paiva, L. Li, and J.C. Príncipe. 2013. Kernel methods on spike train space for neuroscience: a tutorial. IEEE Signal Processing 30(4): 149–160.
Piccinini, G. 2007. Computational modelling vs. computational explanation: Is everything a Turing machine, and does it matter to the philosophy of mind? Australasian Journal of Philosophy 85(1): 93–115.
Piccinini, G., and A. Scarantino. 2010. Computation vs. information processing: why their difference matters to cognitive science. Studies In History and Philosophy of Science Part A 41(3): 237–246.
Putnam, H. 1988. Representation and Reality. Cambridge: MIT Press.
Rama, S., M. Zbili, and D. Debanne. 2015. Modulation of spike-evoked synaptic transmission: The role of presynaptic calcium and potassium channels. Biochimica et Biophysica Acta (BBA) - Molecular Cell Research 1853 (9): 1933–1939.
Rieke, F., D. Warland, R. De Ruyter van Steveninck, and W. Bialek. 1997. Spikes: Exploring the Neural Code. Cambridge: MIT Press.
Roberts, A., and B.M.H. Bush, eds. 1981. Neurones without Impulses: Their Significance for Vertebrate and Invertebrate Nervous Systems. Cambridge: Cambridge University Press.
Rolls, E. T., and A. Treves. 2011. The neuronal encoding of information in the brain. Progress in Neurobiology 95(3): 448–490.
Romo, R., A. Hernandez, A. Zainos, C. Brody, and E. Salinas. 2002. Exploring the cortical evidence of a sensory-discrimination process. Philosophical Transactions of the Royal Society of London Series B: Biological Sciences 357: 1039–1051.
Rowan, M.J.M., and J.M. Christie. 2017. Rapid state-dependent alteration in Kv3 channel availability drives flexible synaptic signaling dependent on somatic subthreshold depolarization. Cell Reports 18(8): 2018–2029.
Shannon, C.E. 1948. A mathematical theory of communication. The Bell System Technical Journal 27: 379–423.
Shapiro, E., V.F. Castellucci, and E.R. Kandel. 1980. Presynaptic membrane potential affects transmitter release in an identified neuron in Aplysia by modulating the Ca2+ and K+ currents. Proceedings of the National Academy of Sciences 77 (1): 629–633.
Shu, Y., A. Hasenstaub, A. Duque, Y. Yu, and D.A. McCormick. 2006. Modulation of intracortical synaptic potentials by presynaptic somatic membrane potential. Nature 441(7094): 761–765.
Strong, S.P., R. Koberle, R.R. De Ruyter van Steveninck, and W. Bialek. 1998. Entropy and information in neural spike trains. Physical Review Letters 80(1): 197–200.
VanRullen, R., R. Guyonneau, and S.J. Thorpe. 2005. Spike times make sense. Trends in Neurosciences 28(1): 1–4.
Victor, J.D. 2006. Approaches to information-theoretic analysis of neural activity. Biological Theory 1(3): 302–316.
Acknowledgments
Thanks to Gualtiero Piccinini, Sarah Robins, and an anonymous reviewer for helpful suggestions.
Maley, C.J. Continuous Neural Spikes and Information Theory. Rev.Phil.Psych. 11, 647–667 (2020). https://doi.org/10.1007/s13164-018-0412-5