20 Years of Computational Neuroscience

Volume 9 of the series Springer Series in Computational Neuroscience pp 73-102


Learning from the Past: Approaches for Reproducibility in Computational Neuroscience

  • Sharon M. Crook, School of Mathematical and Statistical Sciences and School of Life Sciences, Center for Adaptive Neural Systems, Arizona State University (corresponding author)
  • Andrew P. Davison, Unité de Neuroscience Information et Complexité (UNIC), CNRS
  • Hans E. Plesser, Department of Mathematical Sciences and Technology, Norwegian University of Life Sciences



Reproducible experiments are the cornerstone of science: only observations that can be independently confirmed enter the body of scientific knowledge. Computational science should excel in reproducibility, as simulations on digital computers avoid many of the small variations that are beyond the control of the experimental biologist or physicist. In reality, however, computational science has its own challenges for reproducibility: many computational scientists find it difficult to reproduce results published in the literature, and many authors have encountered problems replicating even the figures in their own papers. We present a distinction between different levels of replicability and reproducibility of findings in computational neuroscience. We also demonstrate that simulations of neural models can be highly sensitive to numerical details, and conclude that it is often futile to expect exact replicability of simulation results across simulator software packages. Thus, the computational neuroscience community needs to discuss how to define successful reproduction of simulation studies. Any investigation of failures to reproduce published results will benefit significantly from the ability to track the provenance of the original results. We present tools and best practices developed over the past two decades that facilitate provenance tracking and model sharing.
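The sensitivity to numerical details mentioned above can be illustrated with a minimal sketch (not taken from the chapter): a leaky integrate-and-fire neuron driven by a constant suprathreshold current, integrated with forward Euler at two different step sizes. All parameter values and the function name are illustrative assumptions; the point is only that the recorded spike times depend on the integration step, so bitwise-identical results across tools and settings cannot be expected.

```python
def simulate_lif(dt, t_stop=100.0, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-65.0, r_m=10.0, i_ext=2.0):
    """Return spike times (ms) of a LIF neuron under constant input.

    Integrates tau * dV/dt = -(V - V_rest) + R_m * I_ext with
    forward Euler at step size dt. Illustrative parameters only.
    """
    v = v_rest
    spikes = []
    for step in range(int(t_stop / dt)):
        # forward Euler update
        v += dt / tau * (-(v - v_rest) + r_m * i_ext)
        if v >= v_thresh:
            spikes.append((step + 1) * dt)  # record spike time in ms
            v = v_reset                     # reset membrane potential
    return spikes

# Same model, same input: only the integration step differs,
# yet the spike times drift apart.
coarse = simulate_lif(dt=0.1)
fine = simulate_lif(dt=0.001)
print("first spikes (coarse):", coarse[:3])
print("first spikes (fine):  ", fine[:3])
```

Here the coarse and fine integrations disagree on every spike time, and in a recurrent network such small timing differences can cascade into qualitatively different activity, which is why the chapter argues for defining reproduction at a level above exact numerical identity.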