Generic Back-Propagation in Arbitrary FeedForward Neural Networks

  • Conference paper
Artificial Neural Nets and Genetic Algorithms

Abstract

In this paper, we describe a general mathematical model for feedforward neural networks. The final form of the network is a vectorial function f of two variables, x (the input of the network) and w (the weight vector). We show that the differential of f can be computed with an extended back-propagation algorithm or with a direct method. By evaluating the time needed to compute the differential with each method, we show how to choose the better one. We also introduce input sharing and output functions, which allow us to implement a multilayer perceptron efficiently with our model.
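The abstract contrasts two ways of obtaining the differential of f(x, w): an extended back-propagation (reverse-mode) pass and a direct method that evaluates each partial derivative separately. The following sketch is not the paper's operator model; it is a minimal illustration, with an assumed one-hidden-layer network and assumed shapes, of how the two approaches yield the same gradient at very different costs (one backward pass versus one forward evaluation per weight component):

```python
# Hedged sketch: a tiny network f(x, w) with the gradient of its scalar
# output w.r.t. w computed two ways, mirroring the abstract's comparison.
# The architecture and weight packing are illustrative assumptions.
import numpy as np

def f(x, w):
    """One-hidden-layer network viewed as a function of (x, w).
    w packs both weight matrices into a single vector."""
    W1 = w[:6].reshape(3, 2)   # hidden layer: 2 inputs -> 3 units
    W2 = w[6:].reshape(1, 3)   # output layer: 3 units -> 1 output
    h = np.tanh(W1 @ x)
    return W2 @ h

def backprop_grad(x, w):
    """Reverse mode: one forward pass plus one backward pass."""
    W1 = w[:6].reshape(3, 2)
    W2 = w[6:].reshape(1, 3)
    h = np.tanh(W1 @ x)
    # d f / d W2 is h itself; d f / d W1 chains through tanh'(a) = 1 - h^2.
    dW2 = h
    dW1 = np.outer(W2.ravel() * (1 - h**2), x)
    return np.concatenate([dW1.ravel(), dW2.ravel()])

def direct_grad(x, w, eps=1e-6):
    """Direct method: perturb each weight component, re-evaluating f each
    time -- cost grows linearly with the number of weights."""
    g = np.zeros_like(w)
    for i in range(w.size):
        wp = w.copy(); wp[i] += eps
        wm = w.copy(); wm[i] -= eps
        g[i] = (f(x, wp) - f(x, wm)).item() / (2 * eps)
    return g

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
w = rng.standard_normal(9)
assert np.allclose(backprop_grad(x, w), direct_grad(x, w), atol=1e-5)
```

For a scalar output, back-propagation amortizes the whole gradient over a single backward sweep, while the direct method pays one network evaluation per weight; the paper's point is that the cheaper choice depends on the network's shape, which is why it proposes measuring the time of both.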




Copyright information

© 1995 Springer-Verlag/Wien

About this paper

Cite this paper

Gégout, C., Girau, B., Rossi, F. (1995). Generic Back-Propagation in Arbitrary FeedForward Neural Networks. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-7535-4_45

  • Publisher Name: Springer, Vienna

  • Print ISBN: 978-3-211-82692-8

  • Online ISBN: 978-3-7091-7535-4

  • eBook Packages: Springer Book Archive
