Evolving Node Transfer Functions in Deep Neural Networks for Pattern Recognition

  • Dmytro Vodianyk
  • Przemysław Rokita
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10245)


Theoretical results suggest that in order to learn the complicated functions that represent high-level features in computer vision, one may need to use deep architectures. The popular choice among scientists and engineers for modeling deep architectures is the feed-forward Deep Artificial Neural Network. One of the latest research areas in this field is the evolution of Artificial Neural Networks: NeuroEvolution. This paper explores the effect of evolving a Node Transfer Function and its parameters, along with the evolution of connection weights and an architecture, in Deep Neural Networks for Pattern Recognition problems. The results strongly indicate that evolving Node Transfer Functions shortens the training time of Deep Artificial Neural Networks under NeuroEvolution.
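The abstract's central idea — treating each node's transfer function as a gene evolved alongside the connection weights — can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a hypothetical fixed-topology network, a small pool of candidate transfer functions, and simple truncation selection with Gaussian weight mutation:

```python
import math
import random

# Candidate node transfer functions the evolutionary search can choose from.
TRANSFER_FUNCTIONS = {
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x)))),
    "tanh": math.tanh,
    "relu": lambda x: max(0.0, x),
}

def random_genome(n_inputs, n_hidden):
    """A genome holds connection weights AND a transfer-function gene per node."""
    return {
        "weights": [[random.uniform(-1, 1) for _ in range(n_inputs)]
                    for _ in range(n_hidden)],
        "out_weights": [random.uniform(-1, 1) for _ in range(n_hidden)],
        "transfer": [random.choice(list(TRANSFER_FUNCTIONS))
                     for _ in range(n_hidden)],
    }

def forward(genome, inputs):
    """One hidden layer; each hidden node applies its own evolved transfer function."""
    hidden = [
        TRANSFER_FUNCTIONS[tf](sum(w * x for w, x in zip(row, inputs)))
        for row, tf in zip(genome["weights"], genome["transfer"])
    ]
    return sum(w * h for w, h in zip(genome["out_weights"], hidden))

def mutate(genome, rate=0.2):
    """Perturb weights with Gaussian noise; resample transfer-function genes."""
    return {
        "weights": [[w + random.gauss(0, 0.1) if random.random() < rate else w
                     for w in row] for row in genome["weights"]],
        "out_weights": [w + random.gauss(0, 0.1) if random.random() < rate else w
                        for w in genome["out_weights"]],
        # The key step from the paper's premise: the transfer function itself
        # is subject to mutation, not fixed in advance.
        "transfer": [random.choice(list(TRANSFER_FUNCTIONS))
                     if random.random() < rate else tf
                     for tf in genome["transfer"]],
    }

def evolve(fitness, generations=30, pop_size=20, n_inputs=2, n_hidden=4):
    """Truncation selection: keep the better half, refill by mutating survivors."""
    population = [random_genome(n_inputs, n_hidden) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)  # lower fitness (error) is better
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return min(population, key=fitness)
```

A usage example on a toy XOR-like target: `evolve(lambda g: sum((forward(g, x) - y) ** 2 for x, y in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]))` returns the best genome found, including which transfer function each hidden node settled on. Real NeuroEvolution systems (e.g. NEAT-style approaches) also evolve the topology and use speciated crossover; this sketch omits both for brevity.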



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Institute of Computer Science, Warsaw University of Technology, Warsaw, Poland
