Compute-Efficient Neural Network Architecture Optimization by a Genetic Algorithm

  • Sebastian Litzinger (email author)
  • Andreas Klos
  • Wolfram Schiffmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11728)


A neural network’s topology greatly influences its generalization ability. Many approaches to topology optimization employ heuristics such as genetic algorithms, often consuming immense computational resources. In this contribution, we present a genetic algorithm for network topology optimization that can be deployed effectively in low-resource settings. To this end, we utilize the TensorFlow framework for network training and apply several techniques to reduce the computational load. The genetic algorithm is then applied to the MNIST image classification task in two different scenarios.
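A minimal, self-contained sketch of the kind of topology-optimizing genetic algorithm the abstract describes (not the authors' implementation): a genome encodes a feed-forward topology as a list of hidden-layer sizes, and evolution proceeds by selection, crossover, and mutation. The fitness function below is a placeholder that favors small networks; in the paper's setting, fitness would instead come from training the encoded network (e.g., with TensorFlow) and measuring validation accuracy on MNIST.

```python
import random

def random_genome(max_layers=3, max_units=128):
    # A genome is a list of hidden-layer widths, e.g. [64, 32].
    return [random.randint(4, max_units) for _ in range(random.randint(1, max_layers))]

def fitness(genome):
    # Placeholder fitness: rewards compact topologies. A real implementation
    # would train the encoded network and return its validation accuracy.
    return 1.0 / (1 + sum(genome))

def crossover(a, b):
    # Single-point crossover on the layer lists; never return an empty genome.
    cut_a = random.randint(0, len(a))
    cut_b = random.randint(0, len(b))
    return (a[:cut_a] + b[cut_b:]) or [random.randint(4, 128)]

def mutate(genome, rate=0.2):
    # Perturb each layer width with probability `rate`, keeping widths >= 4.
    return [max(4, g + random.randint(-8, 8)) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=10, generations=5):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)
```

Swapping the placeholder `fitness` for a short TensorFlow training run (with the paper's load-reducing measures, such as few epochs or subsampled data) turns this skeleton into a low-resource architecture search.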


Keywords: Neural networks · Neural Architecture Search · Genetic algorithms



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Sebastian Litzinger (1, email author)
  • Andreas Klos (1)
  • Wolfram Schiffmann (1)

  1. Faculty of Mathematics and Computer Science, FernUniversität in Hagen, Hagen, Germany
