
Backpropagation Issues with Deep Feedforward Neural Networks

  • Conference paper
  • In: Innovations in Smart Cities and Applications (SCAMS 2017)
  • Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 37)

Abstract

Backpropagation is currently the most widely used algorithm for training neural networks. In some cases, however, it becomes less efficient when applied to deep neural networks [8, 9]: the learning process slows down and the sensitivity of the network increases. This paper presents an experimental study of feedforward architectures of varying depth, trained by backpropagation with different learning rates and activation functions, in order to determine how these elements affect the convergence of backpropagation.
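
The paper's own experimental setup is described in the full text; as a rough illustration of the issue the abstract refers to, the sketch below (Python/NumPy, with layer width, data, and initialization chosen arbitrarily for the example, not taken from the paper) runs a single backpropagation pass through sigmoid feedforward networks of increasing depth and prints the gradient norms of the first and last weight layers. With saturating activations such as the sigmoid, the gradient reaching the earliest layers typically shrinks as depth grows, which is one reason learning slows down in deep networks [3, 7].

# Minimal sketch, not the authors' code: all sizes and hyperparameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gradient_norms(depth, width=30, n_samples=64):
    """One backpropagation pass through a sigmoid MLP with `depth` weight
    layers; returns the L2 norm of the weight gradient in each layer."""
    # Random inputs and binary targets (stand-in for a real dataset).
    X = rng.standard_normal((n_samples, width))
    y = rng.integers(0, 2, size=(n_samples, 1)).astype(float)

    # One weight matrix per layer; the last layer has a single output unit.
    sizes = [width] * depth + [1]
    W = [rng.standard_normal((sizes[i], sizes[i + 1])) * 0.5 for i in range(depth)]

    # Forward pass, keeping every activation for the backward pass.
    acts = [X]
    for w in W:
        acts.append(sigmoid(acts[-1] @ w))

    # Backward pass for a mean squared error loss.
    delta = (acts[-1] - y) * acts[-1] * (1 - acts[-1])
    norms = []
    for i in reversed(range(depth)):
        grad = acts[i].T @ delta / n_samples
        norms.append(np.linalg.norm(grad))
        if i > 0:
            delta = (delta @ W[i].T) * acts[i] * (1 - acts[i])
    return list(reversed(norms))  # norms[0] belongs to the earliest layer

for depth in (2, 5, 10):
    norms = gradient_norms(depth)
    print(f"depth={depth:2d}  first-layer grad norm={norms[0]:.2e}  "
          f"last-layer grad norm={norms[-1]:.2e}")

Running the sketch prints one line per depth; under these illustrative settings the first-layer gradient norm is expected to be much smaller than the last-layer one as depth increases, mirroring the convergence slowdown studied in the paper.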


References

  1. Hecht-Nielsen, R.: Theory of the Backpropagation Network. Department of Electrical and Computer Engineering, University of California at San Diego, La Jolla (1992)

  2. Schmidhuber, J.: Deep learning in neural networks: an overview. Neural Netw. 61, 85–117 (2015). https://doi.org/10.1016/j.neunet.2014.09.003. arXiv:1404.7828

  3. Hochreiter, S., Bengio, Y., Frasconi, P., Schmidhuber, J.: Gradient flow in recurrent nets: the difficulty of learning long-term dependencies. In: Kremer, S.C., Kolen, J.F. (eds.) A Field Guide to Dynamical Recurrent Neural Networks. IEEE Press (2001)

  4. LeCun, Y., Cortes, C., Burges, C.J.C.: MNIST handwritten digit database. Accessed 17 Aug 2013

  5. Sigillito, V.G., et al.: Classification of radar returns from the ionosphere using neural networks. Johns Hopkins APL Tech. Digest 10(3), 262–266 (1989)

  6. Fisher, R.A.: The use of multiple measurements in taxonomic problems. Ann. Eugenics 7(Part II), 179–188 (1936)

  7. Nielsen, M.: Why are deep neural networks hard to train? In: Neural Networks and Deep Learning, Chap. 5 (2017)

  8. Ghanou, Y., Bencheikh, G.: Architecture optimization and training for the multilayer perceptron using ant system. IAENG Int. J. Comput. Sci. 43(1), 20–26 (2016)

  9. Ettaouil, M., Ghanou, Y.: Neural architectures optimization and genetic algorithms. WSEAS Trans. Comput. 8(3), 526–537 (2009)


Author information


Corresponding author

Correspondence to Anas El Korchi.



Copyright information

© 2018 Springer International Publishing AG

About this paper


Cite this paper

El Korchi, A., Ghanou, Y. (2018). Backpropagation Issues with Deep Feedforward Neural Networks. In: Ben Ahmed, M., Boudhir, A. (eds) Innovations in Smart Cities and Applications. SCAMS 2017. Lecture Notes in Networks and Systems, vol 37. Springer, Cham. https://doi.org/10.1007/978-3-319-74500-8_31


  • DOI: https://doi.org/10.1007/978-3-319-74500-8_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-74499-5

  • Online ISBN: 978-3-319-74500-8

  • eBook Packages: Engineering, Engineering (R0)
