Modified Error Function with Added Terms for the Backpropagation Algorithm
We have noted that the local minima problem in the backpropagation algorithm is usually caused by update disharmony between the weights connected to the hidden layer and those connected to the output layer. To solve this problem, we propose a modified error function with added terms. Adding one term to the conventional error function allows the modified error function to harmonize the updates of the weights connected to the hidden layer and the output layer, and thus to avoid the local minima caused by such disharmony. Moreover, the new learning parameters introduced for the added term are easy to select. Simulations on the modified XOR problem have been performed to test the validity of the modified error function.
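The general mechanism can be illustrated with a small sketch: train a 2-2-1 sigmoid network on XOR by gradient descent on E = E_mse + λ·E_added, where the gradient of the added term flows into the hidden-layer weight updates alongside the MSE gradient. Note that the specific added term below (a penalty that discourages hidden-unit saturation), the weight λ, and all other parameter values are illustrative assumptions, not the term proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

# 2-2-1 network with small random initial weights
W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

lam, eta = 0.01, 0.5  # added-term weight and learning rate (illustrative values)

def forward(X):
    h = sigmoid(X @ W1 + b1)   # hidden-layer outputs
    y = sigmoid(h @ W2 + b2)   # network outputs
    return h, y

def mse(y):
    return 0.5 * np.mean(np.sum((y - T) ** 2, axis=1))

_, y0 = forward(X)
initial_error = mse(y0)

for _ in range(5000):
    h, y = forward(X)
    # gradient of the conventional MSE part
    dy = (y - T) * y * (1 - y) / len(X)
    dW2, db2 = h.T @ dy, dy.sum(axis=0)
    # backpropagate to the hidden layer, then add the gradient of the
    # illustrative added term E_added = -mean(h * (1 - h)), whose
    # derivative w.r.t. h is (2h - 1) / h.size; this is how an added
    # term reshapes the hidden-layer weight updates
    dh = dy @ W2.T + lam * (2 * h - 1) / h.size
    dz = dh * h * (1 - h)
    dW1, db1 = X.T @ dz, dz.sum(axis=0)
    W1 -= eta * dW1; b1 -= eta * db1
    W2 -= eta * dW2; b2 -= eta * db2

_, y_final = forward(X)
final_error = mse(y_final)
```

Setting `lam = 0` recovers plain backpropagation on the conventional error function, which makes it easy to compare convergence with and without the added term.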