Evolving Programs to Build Artificial Neural Networks

Chapter in: From Astrophysics to Unconventional Computation

Abstract

In general, the topology of Artificial Neural Networks (ANNs) is human-engineered and learning is merely the process of weight adjustment. However, it is well known that this can lead to sub-optimal solutions. Topology and Weight Evolving Artificial Neural Networks (TWEANNs) can find better topologies; however, once obtained, these topologies remain fixed and cannot adapt to new problems. In this chapter, rather than evolving a fixed-structure artificial neural network as in neuroevolution, we evolve a pair of programs that build the network. One program runs inside neurons and allows them to move, change, die or replicate. The other is executed inside dendrites and allows them to change length and weight, be removed, or replicate. The programs are represented and evolved using Cartesian Genetic Programming. From the developed networks, multiple traditional ANNs can be extracted, each of which solves a different problem. The proposed approach has been evaluated on multiple classification problems.
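The abstract describes the approach only at a high level: two evolved programs, one executed in each neuron (soma) and one in each dendrite, repeatedly rewrite a developing network from which conventional ANNs are later extracted. The Python sketch below is purely illustrative of that idea, under assumptions of our own: the real soma and dendrite programs are evolved Cartesian Genetic Programming graphs, whereas here two hand-written functions stand in for them, and every name (Neuron, Dendrite, soma_program, dendrite_program, develop) is hypothetical.

```python
# Illustrative sketch only (not the authors' code): two stand-in developmental
# programs grow and prune a network of neurons and dendrites.
import random
from dataclasses import dataclass, field

@dataclass
class Dendrite:
    weight: float
    length: float

@dataclass(eq=False)          # identity-based equality so list.remove() is safe
class Neuron:
    position: float           # 1-D position, for simplicity
    health: float
    dendrites: list = field(default_factory=list)

def soma_program(n: Neuron):
    """Stand-in for the evolved soma program: from the neuron's local state,
    decide how far to move and whether to replicate or die."""
    move = 0.1 * (0.5 - n.position)      # drift toward the centre
    replicate = n.health > 0.8
    die = n.health < 0.1
    n.health = min(1.0, n.health + 0.05)
    return move, replicate, die

def dendrite_program(d: Dendrite):
    """Stand-in for the evolved dendrite program: adjust weight and length,
    and decide whether the dendrite is removed or replicated."""
    d.weight += 0.01 * (1.0 - abs(d.weight))
    d.length *= 0.99
    return d.length < 0.05, d.weight > 0.9   # (remove, replicate)

def develop(neurons, steps=10):
    """Run both programs repeatedly; the network changes size as it develops."""
    for _ in range(steps):
        newborn = []
        for n in list(neurons):
            move, replicate, die = soma_program(n)
            n.position = min(1.0, max(0.0, n.position + move))
            if die and len(neurons) > 1:
                neurons.remove(n)
                continue
            if replicate:
                newborn.append(Neuron(n.position, 0.5,
                                      [Dendrite(d.weight, d.length) for d in n.dendrites]))
            kept = []
            for d in n.dendrites:
                remove, rep = dendrite_program(d)
                if not remove:
                    kept.append(d)
                if rep:
                    kept.append(Dendrite(d.weight * 0.5, d.length))
            n.dendrites = kept
        neurons.extend(newborn)
    return neurons

if __name__ == "__main__":
    random.seed(0)
    brain = [Neuron(random.random(), random.random(),
                    [Dendrite(random.uniform(-1, 1), random.random()) for _ in range(3)])
             for _ in range(4)]
    brain = develop(brain)
    # A conventional ANN would now be read out of `brain`, e.g. by snapping each
    # dendrite to the nearest upstream neuron to obtain a weight matrix.
    print(len(brain), "neurons after development")
```

The point of the sketch is the division of labour: the soma program only sees and changes neuron-level state, the dendrite program only sees and changes connection-level state, and the conventional network is a read-out of whatever structure development produces.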

Julian F. Miller—It is with great pleasure that I offer this article in honour of Susan Stepney’s 60th birthday. Susan has been a very stimulating and close colleague for many years.


Notes

  1. https://archive.ics.uci.edu/ml/datasets.html.

  2. https://publikationen.bibliothek.kit.edu.

  3. The paper gives a link to the detailed performance of the 179 classifiers, which contains the figures given in the table.

  4. http://www.biostathandbook.com/wilcoxonsignedrank.html (a usage sketch follows these notes).

  5. http://www.real-statistics.com/statistics-tables/wilcoxon-signed-ranks-table/.
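Notes 4 and 5 point to reference material for the Wilcoxon signed-rank test used when comparing classifiers across datasets. As a purely illustrative aside, a paired comparison of that kind could be run as below; the accuracy figures are invented, and SciPy's wilcoxon is simply one common implementation of the test, not the procedure used in the chapter.

```python
# Minimal sketch of a paired Wilcoxon signed-rank comparison of two classifiers.
# The per-dataset accuracies are made up for illustration; they are NOT results
# from the chapter.
from scipy.stats import wilcoxon

method_a = [0.81, 0.74, 0.92, 0.66, 0.88, 0.79, 0.71]
method_b = [0.78, 0.70, 0.93, 0.61, 0.85, 0.77, 0.69]

# Two-sided test on the per-dataset differences.
stat, p_value = wilcoxon(method_a, method_b)
print(f"W = {stat}, p = {p_value:.3f}")
```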


Author information

Corresponding author

Correspondence to Julian F. Miller.



Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Miller, J.F., Wilson, D.G., Cussat-Blanc, S. (2020). Evolving Programs to Build Artificial Neural Networks. In: Adamatzky, A., Kendon, V. (eds) From Astrophysics to Unconventional Computation. Emergence, Complexity and Computation, vol 35. Springer, Cham. https://doi.org/10.1007/978-3-030-15792-0_2

