Open Systems & Information Dynamics

Volume 2, Issue 1, pp. 67–76

Multiscale modeling of developmental processes

  • David H. Sharp
  • John Reinitz
  • Eric Mjolsness


In contrast to most synthetic neural nets, biological neural networks have a strong component of genetic determination which acts before and during experiential learning. Three broad levels of phenomena are present: long-term evolution, involving crossover as well as point mutation; a developmental process mapping genetic information to a set of cells and their internal states of gene expression (genotype to phenotype); and the subsequent synaptogenesis. We describe a very simple mathematical idealization of these three levels which combines the crossover search method of genetic algorithms with the developmental models used in our previous work on “genetic” or “recursively generated” artificial neural nets [18] (and elaborated into a connectionist model of biological development [19]). Despite incorporating all three levels (evolution on genes, development of cells, synapse formation), the model may be far cheaper to compute with than a comparable search carried out directly in synaptic weight space.
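The three-level structure described above can be sketched as a toy program: a genetic algorithm searches over genomes with crossover and point mutation, each genome is first "developed" into a phenotype (here, a small weight vector), and fitness is evaluated only on the developed phenotype. The developmental rule, genome length, and fitness target below are hypothetical placeholders chosen for illustration; they are not the model of [18] or [19].

```python
import random

random.seed(0)

GENOME_LEN = 16   # bits per genome (illustrative encoding)
POP_SIZE = 20
GENERATIONS = 30

def develop(genome):
    """Genotype-to-phenotype map: each pair of bits fixes the sign and
    scale of one 'synaptic weight' (a stand-in developmental rule)."""
    weights = []
    for i in range(0, len(genome), 2):
        sign = 1.0 if genome[i] else -1.0
        scale = 0.5 if genome[i + 1] else 1.0
        weights.append(sign * scale)
    return weights

def fitness(genome):
    """Toy objective on the developed weights: their sum should hit a
    target. The search runs over genomes, not over weight space."""
    target = 3.0
    return -abs(sum(develop(genome)) - target)

def crossover(a, b):
    """Single-point crossover, as in standard genetic algorithms."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.05):
    """Point mutation: flip each bit independently with probability rate."""
    return [(1 - g) if random.random() < rate else g for g in genome]

# Evolve: selection acts on the fitness of the developed phenotype.
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
```

Note that the genome here (16 bits) is much smaller than the phenotype space it indirectly searches, which is the sense in which evolving genomes plus a developmental map can be cheaper than searching synaptic weights directly.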






References

  1. H. J. Antonisse and K. S. Keller, Genetic operators for high-level knowledge representation, in Proceedings of the Second International Conference on Genetic Algorithms and their Applications, Lawrence Erlbaum Associates, 1987.
  2. E. Baum and D. Haussler, What size net gives valid generalization?, in Neural Information Processing Systems 1, p. 81, Morgan Kaufmann, 1989.
  3. L. W. Buss, The Evolution of Individuality, Princeton University Press, 1987.
  4. S. E. Fahlman and C. Lebiere, The cascade-correlation learning architecture, in Neural Information Processing Systems 2, pp. 523–532, Morgan Kaufmann, 1990.
  5. J. D. Farmer, N. H. Packard, and A. S. Perelson, Physica D 22, 187–204, 1986.
  6. J. D. Farmer, Physica D 42, 153–187, 1990.
  7. S. Forrest, Ph.D. Thesis, The University of Michigan, Ann Arbor, MI, USA, 1975.
  8. T. Goto, P. Macdonald, and T. Maniatis, Cell 57, 413–422, 1989.
  9. S. J. Gould, Ontogeny and Phylogeny, Harvard University Press, 1977.
  10. S. J. Hanson, Meiosis networks, in Neural Information Processing Systems 2, pp. 533–541, Morgan Kaufmann, 1990.
  11. K. Harding, T. Hoey, R. Warrior, and M. Levine, The EMBO Journal 8, 1205–1212, 1989.
  12. S. A. Harp, T. Samad, and A. Guha, The genetic synthesis of neural networks, Technical Report CSDD-89-I4852-2, Honeywell Corporate Systems Development Center, June 1989.
  13. J. Holland, Escaping Brittleness: The Possibilities of General Purpose Algorithms Applied to Parallel Rule-Based Systems, chapter 20, Morgan Kaufmann, 1986.
  14. J. Holland, K. J. Holyoak, R. E. Nisbett, and P. R. Thagard, Induction, MIT Press, 1989.
  15. K. S. Lackner, V. D. Sandberg, and D. H. Sharp, Data processing at the SSC with structured neural nets, Technical Report LA-UR-90-3774, Los Alamos National Laboratory, October 1990.
  16. G. F. Miller, P. M. Todd, and S. U. Hegde, Designing neural networks using genetic algorithms, in Third International Conference on Genetic Algorithms, pp. 379–384, 1989.
  17. E. Mjolsness, C. Garrett, and A. Rangarajan, A neural net for reconstruction of multiple curves with a visual grammar, submitted to the International Joint Conference on Neural Networks '91, Seattle, January 1991.
  18. E. Mjolsness, D. H. Sharp, and B. K. Alpert, Adv. Appl. Math. 10, 137–163, 1989.
  19. E. Mjolsness, D. H. Sharp, and J. Reinitz, J. Theor. Biol. 152, 429–453, 1991.
  20. A. S. Perelson, Immunolog. Rev. 110, 1989.
  21. L. Pick, A. Schier, M. Affolter, T. Schmidt-Glenewinkel, and W. J. Gehring, Genes and Development 4, 1224–1239, 1990.
  22. J. Reinitz, E. Mjolsness, and D. H. Sharp, A connectionist model of the Drosophila blastoderm, Technical Report LA-UR-90-3923, Los Alamos National Laboratory, November 1990.
  23. J. Reinitz, E. Mjolsness, and D. H. Sharp, Model for cooperative control of positional information in Drosophila by bcd and maternal hb, Technical Report LA-UR-92-2942, Los Alamos National Laboratory, September 1992.
  24. M. F. Tenorio and Wei-Tsih Lee, Self-organizing neural networks for the identification problem, in Neural Information Processing Systems 1, pp. 57–64, Morgan Kaufmann, 1988.

Copyright information

© Nicholas Copernicus University Press 1993

Authors and Affiliations

  • David H. Sharp (1)
  • John Reinitz (2)
  • Eric Mjolsness (3)
  1. Theoretical Division, Los Alamos National Laboratory, Los Alamos, USA
  2. Center for Medical Informatics, Yale University, New Haven, USA
  3. Department of Computer Science, Yale University, New Haven, USA
