Including Gap Junctions into Distributed Neuronal Network Simulations

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 10087)

Abstract

Contemporary simulation technology for neuronal networks enables the simulation of brain-scale networks using neuron models with a single or a few compartments. However, distributed simulations at full cell density still lack the electrical coupling between cells via so-called gap junctions. This is due to the absence of efficient algorithms to simulate gap junctions on large parallel computers. The difficulty is that gap junctions require an instantaneous interaction between the coupled neurons, whereas the efficiency of simulation codes for spiking neurons relies on delayed communication. In a recent paper [15] we describe a technology to overcome this obstacle. Here, we give an overview of the challenges of including gap junctions in a distributed simulation scheme for neuronal networks and present an implementation of the new technology available in the NEural Simulation Tool (NEST 2.10.0). Subsequently, we introduce the usage of gap junctions in model scripts and present benchmarks assessing the performance and overhead of the technology on the supercomputers JUQUEEN and K computer.
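
A minimal sketch of the underlying waveform-relaxation idea (cf. [24]): two gap-junction-coupled leaky point neurons are integrated repeatedly over one communication interval, each sweep using the partner's waveform from the previous sweep, so data only need to be exchanged at interval boundaries. The Euler integrator and all parameter values below are illustrative assumptions, not the solver used in NEST.

```python
import numpy as np

# Jacobi waveform relaxation for two neurons coupled by a gap junction.
# Illustrative sketch only: simple Euler steps and made-up parameters,
# not the numerical scheme used in NEST.
dt, T = 0.1, 1.0                  # step size and communication interval (ms)
steps = int(T / dt)
tau, C, g, E_L = 10.0, 250.0, 30.0, -70.0   # membrane constants and gap "weight"
V1 = np.full(steps + 1, -70.0)    # initial guess for neuron 1's waveform (mV)
V2 = np.full(steps + 1, -55.0)    # initial guess for neuron 2's waveform (mV)

for sweep in range(10):           # relaxation sweeps within one interval
    V1_old, V2_old = V1.copy(), V2.copy()
    for i in range(steps):
        # each neuron sees the partner's waveform from the previous sweep,
        # so no communication is needed inside the interval
        I_gap1 = g * (V2_old[i] - V1[i])
        I_gap2 = g * (V1_old[i] - V2[i])
        V1[i + 1] = V1[i] + dt * ((E_L - V1[i]) / tau + I_gap1 / C)
        V2[i + 1] = V2[i] + dt * ((E_L - V2[i]) / tau + I_gap2 / C)
    err = max(np.max(np.abs(V1 - V1_old)), np.max(np.abs(V2 - V2_old)))
    if err < 1e-6:                # waveforms converged; accept this interval
        break

print("sweeps:", sweep + 1, "final V1, V2:", V1[-1], V2[-1])
```

Repeating the sweeps until the waveforms stop changing restores the effectively instantaneous coupling while keeping communication restricted to the interval boundaries, which is what makes the scheme compatible with the delayed-communication design of distributed spiking-network simulators.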

Keywords

  • Computational neuroscience
  • Spiking neuronal network
  • Gap junctions
  • Waveform relaxation
  • Supercomputer
  • Large-scale simulation
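
The usage of gap junctions in a model script can be sketched as a short PyNEST example. The model name hh_psc_alpha_gap, the gap_junction connection type, and the wfr_* kernel parameters follow the gap-junction interface described in [15] and the NEST documentation, but the exact names and values given here are assumptions and should be checked against the NEST 2.10.0 release.

```python
import nest

# Hypothetical PyNEST sketch of a gap-junction network, loosely following the
# interface described in [15]; model, synapse, and kernel-parameter names are
# assumptions and should be verified against the NEST 2.10.0 documentation.
nest.ResetKernel()
nest.SetKernelStatus({
    'use_wfr': True,               # enable the iterative (waveform relaxation) scheme
    'wfr_comm_interval': 1.0,      # communication interval in ms
    'wfr_tol': 1e-4,               # convergence tolerance of the iteration
    'wfr_max_iterations': 15,      # cap on relaxation sweeps per interval
    'wfr_interpolation_order': 3,  # order of the exchanged waveform interpolation
})

# Hodgkin-Huxley point neurons that support gap-junction coupling
neurons = nest.Create('hh_psc_alpha_gap', 2)

# a gap junction couples the two neurons symmetrically,
# so the connection is created in both directions
nest.Connect(neurons[:1], neurons[1:], {'rule': 'one_to_one'},
             {'model': 'gap_junction', 'weight': 0.5})
nest.Connect(neurons[1:], neurons[:1], {'rule': 'one_to_one'},
             {'model': 'gap_junction', 'weight': 0.5})

nest.Simulate(100.0)
```

Because a gap junction acts symmetrically on the coupled pair, the connection is created in both directions; the wfr_* settings control how often the iterative scheme communicates and when an interval is accepted as converged.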

References

  1. Albada, S.J., Kunkel, S., Morrison, A., Diesmann, M.: Integrating brain structure and dynamics on supercomputers. In: Grandinetti, L., Lippert, T., Petkov, N. (eds.) BrainComp 2013. LNCS, vol. 8603, pp. 22–32. Springer, Heidelberg (2014). doi:10.1007/978-3-319-12084-3_3

  2. Amit, D.J.: Modeling Brain Function. Cambridge University Press, Cambridge (1989)

  3. Bos, H., Morrison, A., Peyser, A., Hahne, J., Helias, M., Kunkel, S., Ippen, T., Eppler, J.M., Schmidt, M., Seeholzer, A., Djurfeldt, M., Diaz, S., Morén, J., Deepu, R., Stocco, T., Deger, M., Michler, F., Plesser, H.E.: NEST 2.10.0 (Dec 2015). http://dx.doi.org/10.5281/zenodo.44222

  4. Bressloff, P.C.: Spatiotemporal dynamics of continuum neural fields. J. Phys. A: Math. Theor. 45(3), 033001 (2012). http://iopscience.iop.org/1751-8121/45/3/033001

  5. Brunel, N.: Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. J. Comput. Neurosci. 8(3), 183–208 (2000)

  6. Buice, M.A., Cowan, J.D., Chow, C.C.: Systematic fluctuation expansion for neural network activity equations. Neural Comput. 22, 377–426 (2009)

  7. Byrne, G.D., Hindmarsh, A.C.: PVODE, an ODE solver for parallel computers. Int. J. High Perform. Comput. Appl. 13(4), 354–365 (1999). http://hpc.sagepub.com/content/13/4/354.short; http://acts.nersc.gov/sundials/documents/ucrl-jc-132361.pdf

  8. Connors, B.W., Long, M.A.: Electrical synapses in the mammalian brain. Annu. Rev. Neurosci. 27(1), 393–418 (2004)

  9. Dere, E., Zlomuzica, A.: The role of gap junctions in the brain in health and disease. Neurosci. Biobehav. Rev. 36, 206–217 (2011)

  10. Eppler, J.M., Helias, M., Muller, E., Diesmann, M., Gewaltig, M.: PyNEST: a convenient interface to the NEST simulator. Front. Neuroinform. 2, 12 (2009)

  11. Gewaltig, M.O., Diesmann, M.: NEST (NEural Simulation Tool). Scholarpedia 2(4), 1430 (2007)

  12. Ginzburg, I., Sompolinsky, H.: Theory of correlations in stochastic neural networks. Phys. Rev. E 50(4), 3171–3191 (1994)

  13. Glauber, R.: Time-dependent statistics of the Ising model. J. Math. Phys. 4(2), 294–307 (1963)

  14. Grytskyy, D., Tetzlaff, T., Diesmann, M., Helias, M.: A unified view on weakly correlated recurrent networks. Front. Comput. Neurosci. 7, 131 (2013)

  15. Hahne, J., Helias, M., Kunkel, S., Igarashi, J., Bolten, M., Frommer, A., Diesmann, M.: A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations. Front. Neuroinform. 9, 22 (2015)

  16. Hansel, D., Mato, G., Pfeuty, B.: The role of intrinsic cell properties in synchrony of neurons interacting via electrical synapses. In: Schultheiss, N.W., Prinz, A.A., Butera, R.J. (eds.) Phase Response Curves in Neuroscience. SSCN, vol. 6, pp. 361–398. Springer, Heidelberg (2012). doi:10.1007/978-1-4614-0739-3_15

  17. Herculano-Houzel, S.: The human brain in numbers: a linearly scaled-up primate brain. Front. Hum. Neurosci. 3, 31 (2009)

  18. Hertz, J., Krogh, A., Palmer, R.G.: Introduction to the Theory of Neural Computation. Perseus Books, New York (1991)

  19. Hindmarsh, A.C., Brown, P.N., Grant, K.E., Lee, S.L., Serban, R., Shumaker, D.E., Woodward, C.S.: Sundials: Suite of nonlinear and differential/algebraic equation solvers. ACM Trans. Math. Softw. 31(3), 363–396 (2005). http://dl.acm.org/citation.cfm?doid=1089014.1089020

  20. Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 79, 2554–2558 (1982)

  21. Hormuzdi, S., Filippov, M., Mitropoulou, G., Monyer, H., Bruzzone, R.: Electrical synapses: a dynamic signaling system that shapes the activity of neuronal networks. Biochim. Biophys. Acta 1662, 113–137 (2004)

  22. Jülich Supercomputing Centre: JUQUEEN: IBM Blue Gene/Q® supercomputer system at the Jülich Supercomputing Centre. J. Large-scale Res. Facil. 1 (2015). http://dx.doi.org/10.17815/jlsrf-1-18

  23. Kunkel, S., Schmidt, M., Eppler, J.M., Masumoto, G., Igarashi, J., Ishii, S., Fukai, T., Morrison, A., Diesmann, M., Helias, M.: Spiking network simulation code for petascale computers. Front. Neuroinform. 8, 78 (2014)

  24. Lelarasmee, E.: The waveform relaxation method for time domain analysis of large scale integrated circuits: theory and applications. Memorandum No. UCB/ERL M82/40, University of California, Berkeley (1982)

  25. Morrison, A., Mehring, C., Geisel, T., Aertsen, A., Diesmann, M.: Advancing the boundaries of high connectivity network simulation with distributed computing. Neural Comput. 17(8), 1776–1801 (2005)

Acknowledgements

We gratefully acknowledge the NEST core team for an in-depth discussion of the user interface and Mitsuhisa Sato for hosting our activities at RIKEN AICS. Computing time on the K computer was provided through early access in the framework of the co-development program, project hp130120 of the General Use Category (2013), the Strategic Program (project hp150236, Neural Computation Unit, OIST), and MEXT SPIRE Supercomputational Life Science. The authors gratefully acknowledge the computing time on the supercomputer JUQUEEN [22] at Forschungszentrum Jülich granted by the JARA-HPC Vergabegremium (provided on the JARA-HPC partition, jinb33) and by the Gauss Centre for Supercomputing (GCS) (provided by the John von Neumann Institute for Computing (NIC) on the GCS share, hwu12). This work was partly supported by the Helmholtz Portfolio Supercomputing and Modeling for the Human Brain (SMHB), the Initiative and Networking Fund of the Helmholtz Association, the Helmholtz young investigator group VH-NG-1028, the Next-Generation Supercomputer Project of MEXT, and EU grant agreement No 720270 (HBP SGA1). All network simulations were carried out with NEST (http://www.nest-simulator.org).

Author information

Corresponding author

Correspondence to Jan Hahne.

Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Hahne, J. et al. (2016). Including Gap Junctions into Distributed Neuronal Network Simulations. In: Amunts, K., Grandinetti, L., Lippert, T., Petkov, N. (eds.) Brain-Inspired Computing. BrainComp 2015. Lecture Notes in Computer Science, vol. 10087. Springer, Cham. https://doi.org/10.1007/978-3-319-50862-7_4

  • DOI: https://doi.org/10.1007/978-3-319-50862-7_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-50861-0

  • Online ISBN: 978-3-319-50862-7

  • eBook Packages: Computer Science, Computer Science (R0)