Abstract
This chapter describes how to calculate the weights, or synapses, of a multilayer feedforward network. The network learns to associate a given output with a given input by adapting its weights. The weight-adaptation algorithm considered here is steepest descent applied to a nonlinear error function; in the neural network literature this is called backpropagation, and it was made popular in [78, 94].
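The idea summarized in the abstract can be illustrated with a minimal sketch: a small one-hidden-layer network whose weights are adapted by steepest descent on a squared-error function, with the error terms propagated backwards from the output layer. The network shape (2-2-1), the sigmoid activation, the OR training task, and all variable names below are illustrative assumptions, not taken from the chapter itself.

```python
import math
import random

def sigmoid(x):
    """Logistic activation, a common choice of smooth nonlinearity."""
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# Illustrative 2-2-1 feedforward network: hidden weights W1 with
# biases b1, output weights w2 with bias b2.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

# Toy input-output associations to learn (logical OR, an assumption).
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
eta = 0.5  # learning rate: the step size of the steepest descent

def forward(x):
    """One forward pass; returns hidden activations and the output."""
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
    return h, y

def mean_squared_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

loss_before = mean_squared_error()
for epoch in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output-layer error term: derivative of the squared error
        # through the sigmoid, (y - t) * y * (1 - y).
        delta2 = (y - t) * y * (1 - y)
        # Hidden-layer error terms, backpropagated through w2.
        delta1 = [delta2 * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Steepest-descent updates: move each weight against its gradient.
        for j in range(2):
            w2[j] -= eta * delta2 * h[j]
            for i in range(2):
                W1[j][i] -= eta * delta1[j] * x[i]
            b1[j] -= eta * delta1[j]
        b2 -= eta * delta2
loss_after = mean_squared_error()
```

After training, the error has decreased, i.e. the network has adapted its weights so that its outputs move towards the desired associations. This is a per-pattern (online) variant of the descent; the chapter's own derivation should be consulted for the precise update rule it uses.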
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
Cite this chapter
De Wilde, P. (1997). Backpropagation. In: Neural Network Models. Springer, London. https://doi.org/10.1007/978-1-84628-614-8_2
Publisher Name: Springer, London
Print ISBN: 978-3-540-76129-7
Online ISBN: 978-1-84628-614-8