
Evolving Developmental Programs That Build Neural Networks for Solving Multiple Problems

  • Julian F. Miller
  • Dennis G. Wilson
  • Sylvain Cussat-Blanc
Chapter
Part of the Genetic and Evolutionary Computation book series (GEVO)

Abstract

A developmental model of an artificial neuron is presented. In this model, a pair of neural developmental programs develops an entire artificial neural network of arbitrary size. The pair of neural chromosomes is evolved using Cartesian Genetic Programming. During development, neurons and their connections can move, change, die, or be created. We show that this two-chromosome genotype can be evolved so that it develops into a single neural network from which multiple conventional artificial neural networks (ANNs) can be extracted, and the extracted ANNs share some neurons across tasks. We have evaluated the performance of this method on three standard classification problems: the cancer, diabetes, and glass datasets. The evolved pair of neuron programs generates artificial neural networks that perform reasonably well on all three benchmark problems simultaneously. This appears to be the first attempt to solve multiple standard classification problems using a developmental approach.
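The abstract describes the method only at a high level. Below is a minimal, illustrative sketch of the kind of developmental loop it outlines: two programs (one for the soma, one for the dendrites) repeatedly update local neuron state, neurons can replicate or die, and task-specific networks are then read out of the single developed network. All names, update rules, and thresholds here (soma_program, dendrite_program, develop, extract_network, the birth/death limits, extraction by neuron health) are assumptions made purely for illustration; in the chapter the two programs are evolved Cartesian Genetic Programming chromosomes, not hand-written rules.

    # Illustrative sketch only: hand-written stand-in rules replace the evolved
    # CGP soma and dendrite programs described in the chapter.
    import math, random

    random.seed(0)

    class Dendrite:
        def __init__(self, position, weight):
            self.position = position          # where the dendrite reaches for input
            self.weight = weight
            self.health = 1.0

    class Neuron:
        def __init__(self, position):
            self.position = position
            self.bias = random.uniform(-1, 1)
            self.health = 1.0
            self.dendrites = [Dendrite(random.random(), random.uniform(-1, 1))
                              for _ in range(3)]

    # Hypothetical stand-ins for the two evolved programs: each maps local state
    # to updated state (move, grow or shrink health, change weights).
    def soma_program(n):
        n.position = min(1.0, max(0.0, n.position + 0.05 * math.tanh(n.bias)))
        n.health += 0.1 * math.cos(n.position * math.pi)

    def dendrite_program(n, d):
        d.weight = math.tanh(d.weight + 0.1 * (n.position - d.position))
        d.health += 0.05 * d.weight

    def develop(neurons, steps=5, birth=1.5, death=0.2):
        # Repeatedly apply the two programs; neurons may replicate or die.
        for _ in range(steps):
            for n in list(neurons):
                soma_program(n)
                for d in n.dendrites:
                    dendrite_program(n, d)
                n.dendrites = [d for d in n.dendrites if d.health > death]
                if n.health > birth:                  # neuron replication
                    neurons.append(Neuron(n.position + 0.01))
                    n.health = 1.0
                elif n.health < death:                # neuron death
                    neurons.remove(n)
        return neurons

    def extract_network(neurons, n_outputs):
        # Assumed read-out rule: the n_outputs healthiest neurons act as the
        # task's output layer; the remainder are hidden neurons shared by tasks.
        ranked = sorted(neurons, key=lambda n: n.health, reverse=True)
        return ranked[:n_outputs], ranked[n_outputs:]

    brain = develop([Neuron(random.random()) for _ in range(8)])
    for task, n_out in [("cancer", 2), ("diabetes", 2), ("glass", 6)]:
        outputs, hidden = extract_network(brain, n_out)
        print(task, "outputs:", len(outputs), "shared hidden:", len(hidden))

The point of the sketch is the shared structure: a single developed pool of neurons is grown once, and each classification task extracts its own conventional feed-forward network from it, so hidden neurons are reused across tasks rather than trained per problem.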


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Julian F. Miller, University of York, Heslington, York, UK
  • Dennis G. Wilson, University of Toulouse, IRIT - CNRS - UMR5505, Toulouse, France
  • Sylvain Cussat-Blanc, University of Toulouse, IRIT - CNRS - UMR5505, Toulouse, France
