Improving NeuroEvolution Efficiency by Surrogate Model-Based Optimization with Phenotypic Distance Kernels

  • Jörg Stork
  • Martin Zaefferer
  • Thomas Bartz-Beielstein
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11454)

Abstract

In NeuroEvolution, the topologies of artificial neural networks are optimized with evolutionary algorithms to solve tasks in data regression, data classification, or reinforcement learning. One downside of NeuroEvolution is the large number of fitness evaluations it requires, which can render it inefficient for tasks with expensive evaluations, such as real-time learning. For such expensive optimization tasks, surrogate model-based optimization is frequently applied, as it offers good evaluation efficiency. While a combination of both procedures appears to be a valuable approach, defining adequate distance measures for the surrogate modeling process is difficult. In this study, we extend Cartesian genetic programming of artificial neural networks with surrogate model-based optimization. We propose different distance measures and test our algorithm on a replicable benchmark task. The results indicate that we can significantly increase the evaluation efficiency and that a phenotypic distance, based on the behavior of the associated neural networks, is most promising.
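The phenotypic distance mentioned in the abstract compares two networks by their behavior, i.e., their outputs on a shared set of sample inputs, rather than by their genotypes or topologies. A minimal sketch of this idea follows; the callable networks, the input sample `X`, and the kernel parameter `theta` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np


def phenotypic_distance(net_a, net_b, X):
    """Euclidean distance between the output vectors of two networks,
    evaluated on a shared input sample X. Networks are assumed to be
    callables mapping a single input to a scalar output."""
    out_a = np.array([net_a(x) for x in X], dtype=float).ravel()
    out_b = np.array([net_b(x) for x in X], dtype=float).ravel()
    return np.linalg.norm(out_a - out_b)


def phenotypic_kernel(net_a, net_b, X, theta=1.0):
    """Radial-basis-style kernel built from the phenotypic distance,
    usable as a correlation function in a distance-based surrogate
    model (e.g., Kriging over network space)."""
    d = phenotypic_distance(net_a, net_b, X)
    return np.exp(-theta * d ** 2)
```

Networks that behave identically on the sample have distance zero and kernel value one, so the surrogate treats them as interchangeable even if their topologies differ; this is precisely what a purely genotypic distance cannot capture.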

Keywords

Neuroevolution · Surrogate models · Kernel · Distance · Optimization

Notes

Acknowledgements

This work is supported by the German Federal Ministry of Education and Research in the funding program Forschung an Fachhochschulen under the grant number 13FH007IB6.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Jörg Stork
  • Martin Zaefferer
  • Thomas Bartz-Beielstein

  1. Institute for Data Science, Engineering, and Analytics, TH Köln, Gummersbach, Germany
