Abstract
This chapter describes artificial neural networks (ANNs) as coupled lattices of dynamic nonlinear processing elements and studies ways to adapt their parameters. This view extends the conventional paradigm of static neural networks and sheds light on the principles of parameter optimization in both the static and the dynamic case. We show how gradient descent learning can be implemented naturally with local rules in coupled lattices. We review the present state of the art in neural network training and present recent results that exploit the distributed nature of coupled lattices for optimization.
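To make the abstract's central claim concrete, the following is a minimal Python sketch of gradient descent driven by purely local rules on a chain of processing elements. It illustrates the general idea only and is not the chapter's algorithm; the class LinearTanhPE, the toy input/target values, and the learning rate are invented for this example. Each element stores only its own weight and last activations, and the backward pass hands a single error signal between neighbours, so every update uses locally available quantities.

```python
import math


class LinearTanhPE:
    """One processing element (PE): y = tanh(w * x). Hypothetical example class."""

    def __init__(self, w):
        self.w = w
        self.x = 0.0  # last input, stored locally for the backward pass
        self.y = 0.0  # last output

    def forward(self, x):
        self.x = x
        self.y = math.tanh(self.w * x)
        return self.y

    def backward(self, delta_in, lr):
        # Local error through the tanh nonlinearity: tanh'(a) = 1 - tanh(a)^2
        local = delta_in * (1.0 - self.y ** 2)
        delta_out = local * self.w      # error signal passed to the upstream PE
        self.w -= lr * local * self.x   # local weight update: dE/dw = local * x
        return delta_out


# Toy task: train a three-PE chain to map the input 0.5 to the target 0.8.
chain = [LinearTanhPE(w) for w in (0.9, 1.1, 0.8)]
x_in, target, lr = 0.5, 0.8, 0.3
for _ in range(500):
    # Forward dataflow: each PE sees only its predecessor's output.
    signal = x_in
    for pe in chain:
        signal = pe.forward(signal)
    # Backward dataflow: the error enters at the output and flows back
    # through the same chain, one local step at a time.
    delta = signal - target  # dE/dy for E = (y - target)^2 / 2
    for pe in reversed(chain):
        delta = pe.backward(delta, lr)

print(f"output after training: {chain[-1].y:.3f} (target {target})")
```

No element in the chain ever sees a global error surface: the weight update in `backward` depends only on the PE's stored input, its own output, and the single delta arriving from its downstream neighbour, which is the local, dataflow character of gradient descent that the abstract refers to.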
Copyright information
© 2002 Springer Science+Business Media Dordrecht
Cite this chapter
Principe, J.C., Lefebvre, C., Fancourt, C.L. (2002). Dataflow Learning in Coupled Lattices: An Application to Artificial Neural Networks. In: Pardalos, P.M., Romeijn, H.E. (eds) Handbook of Global Optimization. Nonconvex Optimization and Its Applications, vol 62. Springer, Boston, MA. https://doi.org/10.1007/978-1-4757-5362-2_10
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4419-5221-9
Online ISBN: 978-1-4757-5362-2