Abstract
Backpropagation is currently the most widely used training algorithm for neural networks. However, it becomes less efficient for deep neural networks [8, 9]: learning slows down and the sensitivity of the network increases. This paper presents an experimental study of backpropagation on feedforward architectures of varying depth, with different learning rates and activation functions, in order to determine the relation between these elements and their impact on the convergence of backpropagation.
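The slowdown in deep networks mentioned above is the vanishing-gradient effect discussed in [8, 9]: with saturating activations such as the sigmoid, the error signal shrinks at each layer it is propagated back through. The following sketch (not the authors' code; network width, weight scale, and depths are illustrative choices) backpropagates a unit error through a stack of sigmoid layers and compares the gradient magnitude reaching the first layer for a shallow and a deep network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gradient_norm_at_first_layer(depth, width=30, seed=0):
    """Backpropagate a unit error through `depth` sigmoid layers and
    return the norm of the gradient that reaches the first layer."""
    rng = np.random.default_rng(seed)
    weights = [rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
               for _ in range(depth)]
    # Forward pass, keeping each layer's activation for the backward pass.
    a = rng.normal(size=width)
    activations = []
    for W in weights:
        a = sigmoid(W @ a)
        activations.append(a)
    # Backward pass: each step multiplies the error by W^T and by the
    # sigmoid derivative a*(1-a), which is at most 0.25.
    delta = np.ones(width)
    for W, act in zip(reversed(weights), reversed(activations)):
        delta = (W.T @ delta) * act * (1.0 - act)
    return np.linalg.norm(delta)

shallow = gradient_norm_at_first_layer(depth=2)
deep = gradient_norm_at_first_layer(depth=20)
print(shallow, deep)  # the deep network's gradient is orders of magnitude smaller
```

Because every layer contributes a multiplicative factor below one, the gradient reaching the early layers decays roughly geometrically with depth, which is why those layers train so slowly under plain backpropagation.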
References
Hecht-Nielsen, R.: Theory of the backpropagation network. Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla (1992)
Schmidhuber, J.: Deep learning in neural networks: an overview. Neural Netw. 61, 85–117 (2015). https://doi.org/10.1016/j.neunet.2014.09.003. arXiv:1404.7828
Hochreiter, S., Bengio, Y., Frasconi, P., Schmidhuber, J.: Gradient flow in recurrent nets: the difficulty of learning long-term dependencies. In: Kremer, S.C., Kolen, J.F. (eds.) A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press (2001)
LeCun, Y., Cortes, C., Burges, C.J.C.: MNIST handwritten digit database. Accessed 17 Aug 2013
Sigillito, V.G., et al.: Classification of radar returns from the ionosphere using neural networks. Johns Hopkins APL Tech. Digest 10(3), 262–266 (1989)
Fisher, R.A.: The use of multiple measurements in taxonomic problems. Ann. Eugenics Part II 7, 179–188 (1936)
Nielsen, M.: Why are deep neural networks hard to train? In: Neural Networks and Deep Learning, chap. 5 (2017)
Ghanou, Y., Bencheikh, G.: Architecture optimization and training for the multilayer perceptron using ant system. IAENG Int. J. Comput. Sci. 43(1), 20–26 (2016)
Ettaouil, M., Ghanou, Y.: Neural architectures optimization and genetic algorithms. WSEAS Trans. Comput. 8(3), 526–537 (2009)
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
El Korchi, A., Ghanou, Y. (2018). Backpropagation Issues with Deep Feedforward Neural Networks. In: Ben Ahmed, M., Boudhir, A. (eds) Innovations in Smart Cities and Applications. SCAMS 2017. Lecture Notes in Networks and Systems, vol 37. Springer, Cham. https://doi.org/10.1007/978-3-319-74500-8_31
Print ISBN: 978-3-319-74499-5
Online ISBN: 978-3-319-74500-8