Abstract
A new method for neuron-specific adaptation of the learning rate in backpropagation networks is presented, which yields a considerable acceleration of convergence, in particular for large, multi-layer networks. This is demonstrated on the two-spirals benchmark as well as on two real-world problems from medical engineering and physics.
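The paper's exact adaptation rule is not reproduced in this preview. As a hedged illustration of the general idea only, the sketch below maintains one learning rate per hidden neuron and adjusts it by gradient-sign agreement, in the spirit of delta-bar-delta-style heuristics from the literature. The XOR stand-in task, the network size, and the adaptation factors (1.05 up, 0.5 down) are all assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only: neuron-specific learning rates adapted by
# gradient-sign agreement (a delta-bar-delta-style heuristic). This is
# NOT the authors' exact rule; task, factors and network size are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 sigmoid network trained on XOR as a stand-in problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

eta = np.full(4, 0.5)        # one learning rate per hidden neuron
prev_sign = np.zeros(4)      # gradient sign of the previous epoch
up, down = 1.05, 0.5         # assumed increase/decrease factors

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for epoch in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass for mean squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    gW1 = X.T @ d_h          # (2, 4): column j = gradients into neuron j

    # Per-neuron adaptation: stable gradient sign -> larger step,
    # sign flip -> smaller step.
    sig = np.sign(gW1.sum(axis=0))
    eta = np.clip(np.where(sig == prev_sign, eta * up, eta * down),
                  1e-3, 2.0)
    prev_sign = sig

    # Neuron-specific rates on the hidden layer, a fixed rate elsewhere.
    W1 -= gW1 * eta          # eta broadcasts over each neuron's column
    b1 -= d_h.sum(axis=0) * eta
    W2 -= 0.5 * (h.T @ d_out)
    b2 -= 0.5 * d_out.sum(axis=0)
```

The hedge against divergence here is the clip on `eta`; where the rates should live (per neuron, per weight, or per layer) is exactly the design question the paper addresses.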
Copyright information
© 1998 Springer-Verlag Berlin Heidelberg
Cite this paper
Linder, R., Pöppl, S.J. (1998). Optimierung der Konvergenzgeschwindigkeit von Backpropagation. In: Levi, P., Schanz, M., Ahlers, RJ., May, F. (eds) Mustererkennung 1998. Informatik aktuell. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-72282-0_18
Print ISBN: 978-3-540-64935-9
Online ISBN: 978-3-642-72282-0