
Evolutionary Optimisation of Fully Connected Artificial Neural Network Topology

  • Jordan J. Bird
  • Anikó Ekárt
  • Christopher D. Buckingham
  • Diego R. Faria
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 997)

Abstract

This paper proposes an approach to selecting the number of layers and neurons within the hidden layers of a Multilayer Perceptron through a single-objective evolutionary process, with model accuracy as the fitness objective. At each generation, a population of Neural Network architectures is created and ranked by accuracy. The generated solutions are combined in a breeding process to create a larger population, and at each generation the weakest solutions are removed to maintain the population size, inspired by a Darwinian ‘survival of the fittest’. Multiple datasets are tested, and results show that architectures can be successfully improved and derived through a hyper-heuristic evolutionary approach in less than 10% of the exhaustive search time. The evolutionary approach is further optimised by increasing the population density and by gradually increasing the maximum permitted solution complexity over the course of the simulation.
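The sketch below illustrates the kind of generational loop the abstract describes: candidate hidden-layer topologies are bred, ranked by cross-validated accuracy, and the weakest are culled each generation. It is a minimal illustration only, not the authors' implementation; the use of scikit-learn's MLPClassifier, the Iris dataset, the complexity bounds, and the crossover/mutation operators are all assumptions made for the example.

```python
# Illustrative sketch of a single-objective evolutionary topology search
# (assumed operators, bounds and dataset -- not the paper's exact method).
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

MAX_LAYERS, MAX_NEURONS = 3, 64      # assumed bounds on solution complexity
POP_SIZE, GENERATIONS = 10, 5        # assumed population size and budget


def random_topology():
    """A candidate solution: a tuple of hidden-layer sizes."""
    depth = random.randint(1, MAX_LAYERS)
    return tuple(random.randint(1, MAX_NEURONS) for _ in range(depth))


def fitness(topology):
    """Single objective: mean cross-validated classification accuracy."""
    clf = MLPClassifier(hidden_layer_sizes=topology, max_iter=300)
    return cross_val_score(clf, X, y, cv=3).mean()


def breed(a, b):
    """One-point crossover on the layer lists plus a small neuron-count mutation."""
    cut_a, cut_b = random.randint(0, len(a)), random.randint(0, len(b))
    child = (list(a[:cut_a]) + list(b[cut_b:]))[:MAX_LAYERS]
    if not child:                              # guard against an empty offspring
        child = [random.randint(1, MAX_NEURONS)]
    i = random.randrange(len(child))
    child[i] = max(1, min(MAX_NEURONS, child[i] + random.randint(-4, 4)))
    return tuple(child)


population = [random_topology() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    # Enlarge the population by breeding randomly chosen parents,
    # then rank all solutions by accuracy and cull the weakest
    # ("survival of the fittest") to restore the population size.
    offspring = [breed(*random.sample(population, 2)) for _ in range(POP_SIZE)]
    population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]
    print(f"generation {gen}: best {population[0]}, "
          f"accuracy {fitness(population[0]):.3f}")
```

A practical refinement suggested by the abstract is to start with small values of MAX_LAYERS/MAX_NEURONS and relax them as the simulation progresses, so that cheap, simple architectures are explored before more complex ones.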

Keywords

Neural networks · Evolutionary computation · Neuroevolution · Hyperheuristics · Computational intelligence

Notes

Acknowledgments

This work was supported by the European Commission through the H2020 project EXCELL (https://www.excell-project.eu/), grant No. 691829.

This work was also partially supported by the EIT Health GRaCEAGE grant number 18429 awarded to C.D. Buckingham.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jordan J. Bird¹ (Email author)
  • Anikó Ekárt¹
  • Christopher D. Buckingham¹
  • Diego R. Faria¹

  1. School of Engineering and Applied Science, Aston University, Birmingham, UK
