Abstract
Feedforward neural networks can learn a training dataset without prior knowledge, but adjusting their weights by gradient descent is prone to becoming trapped in local minima. Repeatedly training from random starting weights is a popular remedy, yet it demands extensive computation. This paper proposes a simultaneous training method with removal criteria that eliminates less promising networks during training, which reduces the probability of settling in a local minimum while using computational resources efficiently. Experimental results demonstrate the effectiveness and efficiency of the proposed method compared with conventional training.
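The core idea can be sketched as follows: several randomly initialized networks are trained in parallel, and at fixed checkpoints the worst-performing candidates are removed so that only promising runs consume further computation. This is a minimal illustrative sketch, not the paper's exact algorithm: the network size (2-4-1), the XOR task, the halving schedule, and the MSE-based removal criterion are all assumptions, and the "simultaneous" training is simulated sequentially.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy task: XOR, a classic example of a loss surface with local minima.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_net():
    # 2-4-1 network with small random starting weights (hypothetical sizes).
    return {"W1": rng.normal(0, 1, (2, 4)), "b1": np.zeros(4),
            "W2": rng.normal(0, 1, (4, 1)), "b2": np.zeros(1)}

def forward(net, X):
    h = sigmoid(X @ net["W1"] + net["b1"])
    return h, sigmoid(h @ net["W2"] + net["b2"])

def train_step(net, lr=0.5):
    # One full-batch gradient-descent step on squared error.
    h, out = forward(net, X)
    err = out - y
    d_out = err * out * (1 - out)                # backprop through output sigmoid
    d_h = (d_out @ net["W2"].T) * h * (1 - h)    # backprop through hidden layer
    net["W2"] -= lr * h.T @ d_out
    net["b2"] -= lr * d_out.sum(0)
    net["W1"] -= lr * X.T @ d_h
    net["b1"] -= lr * d_h.sum(0)
    return float(np.mean(err ** 2))              # training MSE

nets = [init_net() for _ in range(8)]
for checkpoint in range(3):                      # removal rounds: 8 -> 4 -> 2 -> 1
    losses = []
    for net in nets:
        for _ in range(2000):
            loss = train_step(net)
        losses.append(loss)
    # Removal criterion (assumed here): keep the better half by training error.
    order = np.argsort(losses)
    nets = [nets[i] for i in order[:max(1, len(nets) // 2)]]

_, out = forward(nets[0], X)
print(np.round(out.ravel(), 2))
```

Compared with fully training all eight restarts to completion, the removal schedule spends most of the epoch budget on the candidates whose error curves look most promising, which is the resource-efficiency argument made in the abstract.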
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Atakulreka, A., Sutivong, D. (2007). Avoiding Local Minima in Feedforward Neural Networks by Simultaneous Learning. In: Orgun, M.A., Thornton, J. (eds) AI 2007: Advances in Artificial Intelligence. AI 2007. Lecture Notes in Computer Science(), vol 4830. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76928-6_12
DOI: https://doi.org/10.1007/978-3-540-76928-6_12
Print ISBN: 978-3-540-76926-2
Online ISBN: 978-3-540-76928-6