Continuous Neural Spikes and Information Theory

Review of Philosophy and Psychology

Abstract

Can information theory be used to understand neural signaling? Yes, but assumptions have to be made about the nature of that signaling. The traditional view is that the individual neural spike is an all-or-none phenomenon, which allows neural spikes to be viewed as discrete, binary pulses, similar in kind to the signals in digital computers. Under this assumption, the tools of information theory can be used to derive results about the properties of neural signals. However, new results from neuroscience demonstrate that the precise shape of the individual spike can be functionally significant, thus violating the assumption that spikes can always be treated as binary pulses. Instead, spikes must sometimes be viewed as continuous signals. Fortunately, information-theoretic tools exist for the study of continuous signals; unfortunately, their use in the continuous domain is very different from their use in the discrete domain, and not always well understood. Researchers interested in making precise claims about the nature of the information used, stored, and processed in neural systems must pay careful attention to these differences.
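
As a minimal sketch of the discrete treatment, assuming invented spike times and a 2 ms bin width, the following bins a spike train into binary words and estimates the Shannon entropy of the word distribution, in the spirit of the direct method of Strong et al. (1998), cited below.

```python
import numpy as np
from collections import Counter

# Invented spike times (seconds) and a 2 ms bin width, for illustration only.
spike_times = np.array([0.003, 0.011, 0.012, 0.025, 0.031, 0.044, 0.052])
bin_width, duration, word_len = 0.002, 0.06, 3

# The all-or-none view: each bin is 1 if it contains a spike, else 0.
edges = np.arange(0.0, duration + bin_width, bin_width)
binary = (np.histogram(spike_times, bins=edges)[0] > 0).astype(int)

# Chop the binary train into 3-bin "words" and estimate their entropy.
words = [tuple(binary[i:i + word_len])
         for i in range(0, len(binary) - word_len + 1, word_len)]
probs = np.array([c / len(words) for c in Counter(words).values()])
print(f"{-np.sum(probs * np.log2(probs)):.3f} bits per {word_len}-bin word")
```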


Notes

  1. What “objective” amounts to here will be spelled out further below; the objectivity of information becomes complicated when we move from discrete to continuous systems.

  2. A number of textbooks on information theory provide the relevant details, such as Aczél and Daróczy (1975).

  3. This is based on a similar concern about whether rocks implement any arbitrary finite automaton (Putnam 1988).

  4. One of the first applications of information theory to neural firing is MacKay and McCulloch (1952). Surveys of these results and of the specific techniques can be found in Rieke et al. (1997); chapters 1 and 4 of Dayan and Abbott (2005); Victor (2006); Rolls and Treves (2011); and a special issue of the Journal of Computational Neuroscience (Dimitrov et al. 2011).

  5. In English: as the number of symbols approaches infinity, so does the information entropy.
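
A minimal numeric check of this claim: for n equiprobable symbols the entropy is \(\log_2 n\), which grows without bound as n does.

```python
import numpy as np

# Entropy of n equiprobable symbols: -sum((1/n) * log2(1/n)) = log2(n).
for n in (2, 16, 1024, 2**20):
    p = np.full(n, 1.0 / n)
    print(n, -np.sum(p * np.log2(p)), np.log2(n))  # the two values agree
```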

  6. Of course, if the probabilities were not equal, then there may be ways of creating the partitions that lower the average number of choices, which in turn means the information entropy would be lower.
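
A minimal sketch, with made-up probabilities: any departure from equiprobability pulls the entropy below the \(\log_2 n\) maximum.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: the log2(4) maximum
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits: skew lowers it
```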

  7. These examples are analogous to perfectly balanced binary search trees in computer science.

  8. With the examples above, we can subtract the differential entropy of Eq. 6 from that of Eq. 5, giving us 2.474. But \(\log_2\left(\frac{5}{0.9}\right) = 2.474\). Thus, the difference of the entropies is just the base-2 logarithm of the quotient of the upper ends of the two PDFs.
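
A quick check of the arithmetic, assuming Eqs. 5 and 6 are the differential entropies of uniform densities on [0, 5] and [0, 0.9] (which the note's last sentence suggests): the differential entropy of a uniform density on [0, w] is \(\log_2 w\), so the difference collapses to the log of the ratio of the widths.

```python
import numpy as np

def h_uniform(width):
    # Differential entropy (bits) of a uniform density on [0, width]:
    # h = -integral of (1/width) * log2(1/width) dx = log2(width).
    return np.log2(width)

diff = h_uniform(5.0) - h_uniform(0.9)  # presumed Eq. 5 minus presumed Eq. 6
print(diff, np.log2(5.0 / 0.9))         # both print 2.4739...
```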

  9. In my opinion, “data” is a mass noun, not a count noun, which makes sense of terms like “data point” and “data set.”

  10. Similar results have been known to occur in invertebrates for decades. For example, Shapiro et al. (1980) demonstrate that the release of neurotransmitter is a function of the waveform of neural spikes in sea slugs.

  11. A post-synaptic neuron is a neuron “downstream” from the neuron in question. When discussing neural firing, it is conventional to call the neuron generating a spike the pre-synaptic neuron, and the one that will be affected by this spike after it reaches the synapse the post-synaptic neuron.

  12. This is, of course, how these systems are designed: the continuously varying voltage is meant to realize logical elements that only take one of two values.
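
A toy sketch of that design point, with invented, TTL-like thresholds: anything below one threshold reads as logical 0, anything above another as logical 1, and the band in between is undefined.

```python
def logic_level(voltage, v_low=0.8, v_high=2.0):
    """Map a continuous voltage to a binary logic value.
    The thresholds are invented, TTL-like values for illustration."""
    if voltage <= v_low:
        return 0
    if voltage >= v_high:
        return 1
    return None  # forbidden region: the digital abstraction breaks down

print([logic_level(v) for v in (0.2, 1.3, 3.1)])  # [0, None, 1]
```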

References

  • Aczél, J., and Z. Daróczy. 1975. On Measures of Information and Their Characterizations. New York: Academic Press.

  • Alle, H., and J.R.P. Geiger. 2006. Combined analog and action potential coding in hippocampal mossy fibers. Science 311: 1290–1293.

  • Baddeley, R. 2000. Introductory information theory and the brain. In Information theory and the brain, eds. R. Baddeley, P. Hancock, and P. Földiák. Cambridge: Cambridge University Press.

  • Baker, A. 2005. Are there genuine mathematical explanations of physical phenomena? Mind 114(454): 223–238.

  • Bialowas, A., S. Rama, M. Zbili, V. Marra, L. Fronzaroli Molinieres, N. Ankri, E. Carlier, and D. Debanne. 2015. Analog modulation of spike-evoked transmission in CA3 circuits is determined by axonal Kv1.1 channels in a time-dependent manner. European Journal of Neuroscience 41(3): 293–304.

  • Cao, R. 2014. Signaling in the brain: in search of functional units. Philosophy of Science 81(5): 891–901.

  • Chalmers, D.J. 1996. Does a rock implement every finite-state automaton? Synthese 108(3): 309–333.

  • Dayan, P., and L.F. Abbott. 2005. Theoretical Neuroscience. Cambridge: MIT Press.

  • DiCaprio, R.A. 2004. Information transfer rate of nonspiking afferent neurons in the crab. Journal of Neurophysiology 92: 302–310.

  • Dimitrov, A.G., A.A. Lazar, and J.D. Victor. 2011. Information theory in neuroscience. Journal of Computational Neuroscience 30(1): 1–5.

  • Dretske, F. 1981. Knowledge and the Flow of Information. Cambridge: MIT Press.

  • Gerstner, W., W.M. Kistler, R. Naud, and L. Paninski. 2014. Neuronal Dynamics: From single neurons to networks and models of cognition. Cambridge: Cambridge University Press.

  • Ihara, S. 1993. Information Theory for Continuous Systems. River Edge: World Scientific.

  • Jaynes, E.T. 1963. Information theory and statistical mechanics. In Statistical Physics, ed. K. Ford. New York: W. A. Benjamin.

  • MacKay, D.M., and W.S. McCulloch. 1952. The limiting information capacity of a neuronal link. The Bulletin of Mathematical Biophysics 14(2): 127–135.

  • Maley, C.J. 2018. Toward analog neural computation. Minds and Machines 28(1): 77–91.

  • Park, I.M., S. Seth, A.R.C. Paiva, L. Li, and J.C. Príncipe. 2013. Kernel methods on spike train space for neuroscience: a tutorial. IEEE Signal Processing Magazine 30(4): 149–160.

  • Piccinini, G. 2007. Computational modelling vs. computational explanation: Is everything a Turing machine, and does it matter to the philosophy of mind? Australasian Journal of Philosophy 85(1): 93–115.

  • Piccinini, G., and A. Scarantino. 2010. Computation vs. information processing: why their difference matters to cognitive science. Studies In History and Philosophy of Science Part A 41(3): 237–246.

  • Putnam, H. 1988. Representation and Reality. Cambridge: MIT Press.

  • Rama, S., M. Zbili, and D. Debanne. 2015. Modulation of spike-evoked synaptic transmission: The role of presynaptic calcium and potassium channels. Biochimica et Biophysica Acta (BBA) - Molecular Cell Research 1853 (9): 1933–1939.

  • Rieke, F., D. Warland, R. de Ruyter van Steveninck, and W. Bialek. 1997. Spikes: Exploring the neural code. Cambridge: MIT Press.

  • Roberts, A., and B.M.H. Bush, eds. 1981. Neurones without impulses: Their significance for vertebrate and invertebrate nervous systems. Cambridge: Cambridge University Press.

  • Rolls, E. T., and A. Treves. 2011. The neuronal encoding of information in the brain. Progress in Neurobiology 95(3): 448–490.

  • Romo, R., A. Hernandez, A. Zainos, C. Brody, and E. Salinas. 2002. Exploring the cortical evidence of a sensory-discrimination process. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences 357: 1039–1051.

  • Rowan, M.J.M., and J.M. Christie. 2017. Rapid state-dependent alteration in Kv3 channel availability drives flexible synaptic signaling dependent on somatic subthreshold depolarization. Cell Reports 18(8): 2018–2029.

  • Shannon, C.E. 1948. A mathematical theory of communication. The Bell System Technical Journal 27: 379–423.

  • Shapiro, E., V.F. Castellucci, and E.R. Kandel. 1980. Presynaptic membrane potential affects transmitter release in an identified neuron in Aplysia by modulating the Ca2+ and K+ currents. Proceedings of the National Academy of Sciences 77 (1): 629–633.

  • Shu, Y., A. Hasenstaub, A. Duque, Y. Yu, and D.A. McCormick. 2006. Modulation of intracortical synaptic potentials by presynaptic somatic membrane potential. Nature 441(7094): 761–765.

  • Strong, S.P., R. Koberle, R.R. de Ruyter van Steveninck, and W. Bialek. 1998. Entropy and information in neural spike trains. Physical Review Letters 80(1): 197–200.

  • VanRullen, R., R. Guyonneau, and S.J. Thorpe. 2005. Spike times make sense. Trends in Neurosciences 28(1): 1–4.

  • Victor, J.D. 2006. Approaches to information-theoretic analysis of neural activity. Biological Theory 1(3): 302–316.

Acknowledgments

Thanks to Gualtiero Piccinini, Sarah Robins, and an anonymous reviewer for helpful suggestions.

Author information

Correspondence to Corey J. Maley.

Cite this article

Maley, C.J. 2020. Continuous neural spikes and information theory. Review of Philosophy and Psychology 11: 647–667. https://doi.org/10.1007/s13164-018-0412-5
