
Biological Cybernetics, Volume 99, Issue 4–5, pp 241–251

Automated neuron model optimization techniques: a review

  • W. Van Geit
  • E. De Schutter
  • P. Achard
Review

Abstract

The increase in complexity of computational neuron models makes hand tuning of model parameters more difficult than ever. Fortunately, the parallel increase in computer power allows scientists to automate this tuning. Optimization algorithms need two essential components. The first is a function that measures the difference between the output of the model for a given parameter set and the experimental data. This error function, or fitness function, makes it possible to rank different parameter sets. The second component is a search algorithm that explores the parameter space to find the best parameter set in a minimal amount of time. In this review we distinguish three types of error functions: feature-based ones, point-by-point comparisons of voltage traces, and multi-objective functions. We then detail several popular search algorithms, including brute-force methods, simulated annealing, genetic algorithms, evolution strategies, differential evolution and particle-swarm optimization. Finally, we briefly describe Neurofitter, a free software package that combines a phase-plane trajectory density fitness function with several search algorithms.
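To make these two components concrete, below is a minimal, self-contained sketch (not taken from the paper): it pairs a point-by-point error function and a crude feature-based one with a brute-force random search. The simulator stub `run_model`, the parameter bounds, and the synthetic target trace are hypothetical placeholders for whatever model and recordings a modeller would actually use.

```python
# Minimal sketch of the two optimization components described above:
# an error (fitness) function and a search algorithm. The simulator
# call `run_model`, the bounds and the target trace are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def run_model(params, t):
    """Stand-in for a neuron simulation returning a voltage trace."""
    amp, freq = params
    return amp * np.sin(freq * t)            # toy 'membrane potential'

def pointwise_error(v_model, v_data):
    """Point-by-point comparison: RMS difference between the two traces."""
    return np.sqrt(np.mean((v_model - v_data) ** 2))

def feature_error(v_model, v_data, threshold=0.8):
    """Feature-based comparison: mismatch in threshold-crossing count and mean level."""
    crossings_model = np.sum((v_model[1:] >= threshold) & (v_model[:-1] < threshold))
    crossings_data = np.sum((v_data[1:] >= threshold) & (v_data[:-1] < threshold))
    return abs(int(crossings_model) - int(crossings_data)) + abs(v_model.mean() - v_data.mean())

def random_search(error_fn, v_data, t, bounds, n_iter=2000):
    """Brute-force random search: keep the best parameter set seen so far."""
    best_params, best_err = None, np.inf
    for _ in range(n_iter):
        params = [rng.uniform(lo, hi) for lo, hi in bounds]
        err = error_fn(run_model(params, t), v_data)
        if err < best_err:
            best_params, best_err = params, err
    return best_params, best_err

t = np.linspace(0.0, 10.0, 1000)
v_target = run_model((1.0, 3.0), t)          # 'data' generated from known parameters
bounds = [(0.1, 2.0), (0.5, 5.0)]            # search ranges for the two parameters

print(random_search(pointwise_error, v_target, t, bounds))
print(random_search(feature_error, v_target, t, bounds))
```

The same error functions could be handed to any of the search algorithms discussed in the review, for example by replacing the random sampling loop with simulated annealing, a genetic algorithm, or differential evolution.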

Keywords

Optimization · Neuron · Model · Parameters · Automated tuning · Error function · Fitness function

References

  1. Achard P, De Schutter E (2006) Complex parameter landscape for a complex neuron model. PLoS Comput Biol 2: e94
  2. Audet C, Orban D (2004) Finding optimal algorithmic parameters using a mesh adaptive direct search. Cahiers du GERAD G-2004-xx
  3. Baldi P, Vanier MC, Bower JM (1998) On the use of Bayesian methods for evaluating compartmental neural models. J Comput Neurosci 5: 285–314
  4. Banga JR, Moles CG, Alonso AA (2003) Global optimization of bioprocesses using stochastic and hybrid methods. In: Floudas CA, Pardalos PM (eds) Frontiers in global optimization. Nonconvex optimization and its applications. Kluwer, Dordrecht, pp 45–70
  5. Bhalla US, Bower JM (1993) Exploring parameter space in detailed single neuron models: simulations of the mitral and granule cells of the olfactory bulb. J Neurophysiol 69: 1948–1965
  6. Bower JM, Beeman D (1998) The book of GENESIS: exploring realistic neural models with the GEneral NEural SImulation System, 2nd edn. Springer, New York
  7. Broyden CG (1967) Quasi-Newton methods and their application to function minimisation. Math Comput 21: 368–381
  8. Bush K, Knight J, Anderson C (2005) Optimizing conductance parameters of cortical neural models via electrotonic partitions. Neural Netw 18: 488–496
  9. Butera RJ, Rinzel J, Smith JC (1999) Models of respiratory rhythm generation in the pre-Bötzinger complex. I. Bursting pacemaker neurons. J Neurophysiol 82: 382–397
  10. Cohon JL (1985) Multicriteria programming: brief review and application. In: Gero JS (eds) Design optimization. Academic Press, New York
  11. Davison AP, Feng J, Brown D (2000) A reduced compartmental model of the mitral cell for use in network models of the olfactory bulb. Brain Res Bull 51: 393–399
  12. Druckmann S, Banitt Y, Gideon A, Schurmann F, Markram H, Segev I (2007) A novel multiple objective optimization framework for automated constraining of conductance-based neuron models by noisy experimental data. Front Neurosci 1: 7–18
  13. Eiben AE, Schippers CA (1998) On evolutionary exploration and exploitation. Fundam Inf 35: 35–50
  14. Eiben AE, Smith JE (2003) Introduction to evolutionary computing. Springer, Heidelberg
  15. Fonseca CM, Fleming PJ (1993) Genetic algorithms for multiobjective optimization: formulation, discussion and generalization. Genetic algorithms: proceedings of the fifth international conference, pp 416–423
  16. Foster WR, Ungar LH, Schwaber JS (1993) Significance of conductances in Hodgkin–Huxley models. J Neurophysiol 70: 2502–2518
  17. Gerken WC, Purvis LK, Butera RJ (2006) Genetic algorithm for optimization and specification of a neuron model. Neurocomputing 69: 1039–1042
  18. Goldberg DE, Richardson J (1987) Genetic algorithms with sharing for multimodal function optimization. Proceedings of the second international conference on genetic algorithms and their application, pp 41–49
  19. Goldberg DE, Voessner S (1999) Optimizing global-local search hybrids. Proc Gen Evol Comput Conf 1: 220–228
  20. Goldman MS, Golowasch J, Marder E, Abbott LF (2001) Global structure, robustness, and modulation of neuronal models. J Neurosci 21: 5229–5238
  21. Golowasch J, Goldman MS, Abbott LF, Marder E (2002) Failure of averaging in the construction of a conductance-based neuron model. J Neurophysiol 87: 1129–1131
  22. Handl J, Kell DB, Knowles J (2007) Multiobjective optimization in bioinformatics and computational biology. IEEE/ACM Trans Comput Biol Bioinform 4: 279–292
  23. Haufler D, Morin F, Lacaille JC, Skinner FK (2007) Parameter estimation in single-compartment neuron models using a synchronization-based method. Neurocomputing 70: 1605–1610
  24. Hines ML, Carnevale NT (1997) The NEURON simulation environment. Neural Comput 9: 1179–1209
  25. Holland JH (1975) Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence. University of Michigan Press, Ann Arbor
  26. Holmes W, Ambros-Ingerson J, Grover L (2006) Fitting experimental data to models that use morphological data from public databases. J Comput Neurosci 20: 349–365
  27. Horn J, Nafpliotis N, Goldberg DE (1994) A niched Pareto genetic algorithm for multiobjective optimization. Proc First IEEE Conf Evol Comput IEEE World Cong Comput Intell 1: 82–87
  28. Huys QJ, Ahrens MB, Paninski L (2006) Efficient estimation of detailed single-neuron models. J Neurophysiol 96: 872–890
  29. Kennedy J, Eberhart RC (1995) Particle swarm optimization. Proceedings of the IEEE international joint conference on neural networks, pp 1942–1948
  30. Keren N, Peled N, Korngreen A (2005) Constraining compartmental models using multiple voltage recordings and genetic algorithms. J Neurophysiol 94: 3730–3742
  31. Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598): 671–680
  32. LeMasson M (2001) Introduction to equation solving and parameter fitting. In: Computational neuroscience: realistic modeling for experimentalists. CRC Press, London, pp 1–3
  33. Lewicki MS (1998) A review of methods for spike sorting: the detection and classification of neural action potentials. Network 9: R53–R78
  34. Nourani Y, Andresen B (1998) A comparison of simulated annealing cooling strategies. J Phys A Math Gen 31: 8373–8385
  35. Olypher AV, Calabrese RL (2007) Using constraints on neuronal activity to reveal compensatory changes in neuronal parameters. J Neurophysiol 98: 3749–3758
  36. Pettinen A, Yli-Harja O, Linne M (2006) Comparison of automated parameter estimation methods for neuronal signaling networks. Neurocomputing 69: 1371–1374
  37. Potter MA, De Jong K (1994) A cooperative coevolutionary approach to function optimization. Parallel Problem Solving from Nature (PPSN) III: 249–257
  38. Price K, Storn RM, Lampinen JA (2005a) Differential evolution: a practical approach to global optimization (Natural Computing Series). Springer, New York
  39. Price K, Storn RM, Lampinen JA (2005b) The motivation for differential evolution. In: Differential evolution: a practical approach to global optimization (Natural Computing Series). Springer, Berlin, pp 1–6
  40. Prinz AA, Billimoria CP, Marder E (2003) Alternative to hand-tuning conductance-based models: construction and analysis of databases of model neurons. J Neurophysiol 90: 3998–4015
  41. Rechenberg I (1973) Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Frommann-Holzboog, Stuttgart
  42. Reid MS, Brown EA, DeWeerth SP (2007) A parameter-space search algorithm tested on a Hodgkin–Huxley model. Biol Cybern 96: 625–634
  43. Srinivas N, Deb K (1994) Multiobjective optimization using nondominated sorting in genetic algorithms. Evol Comput 2: 221–248
  44. Swensen AM, Bean BP (2005) Robustness of burst firing in dissociated Purkinje neurons with acute or long-term reductions in sodium conductance. J Neurosci 25: 3509–3520
  45. Tabak J, Murphey CR, Moore LE (2000) Parameter estimation methods for single neuron models. J Comput Neurosci 9: 215–236
  46. Taylor AL, Hickey TJ, Prinz AA, Marder E (2006) Structure and visualization of high-dimensional conductance spaces. J Neurophysiol 96: 891–905
  47. Tobin AE, Calabrese RL (2006) Endogenous and half-center bursting in morphologically inspired models of leech heart interneurons. J Neurophysiol 96: 2089–2106
  48. Van Geit W, Achard P, De Schutter E (2007) Neurofitter: a parameter tuning package for a wide range of electrophysiological neuron models. Front Neuroinformatics 1: 1
  49. Vanier MC, Bower JM (1999) A comparative survey of automated parameter-search methods for compartmental neural models. J Comput Neurosci 7: 149–171
  50. Weaver CM, Wearne SL (2006) The role of action potential shape and parameter constraints in optimization of compartment models. Neurocomputing 69: 1053–1057
  51. Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1: 67–82

Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  1. Computational Neuroscience Unit, Okinawa Institute of Science and Technology, Okinawa, Japan
  2. Theoretical Neurobiology, University of Antwerp, Antwerp, Belgium
  3. Volen Center for Complex Systems, Brandeis University, Waltham, USA
