Modified Error Function with Added Terms for the Backpropagation Algorithm

  • Weixing Bi
  • Xugang Wang
  • Ziliang Zong
  • Zheng Tang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3173)

Abstract

We have observed that the local minima problem in the backpropagation algorithm is usually caused by a disharmony between the updates of the weights connected to the hidden layer and those connected to the output layer. To address this, we propose a modified error function with an added term. By adding one term to the conventional error function, the modified error function harmonizes the updates of the weights connected to the hidden layer and the output layer, and thus avoids the local minima caused by such disharmony. Moreover, the new learning parameters introduced for the added term are easy to select. Simulations on the modified XOR problem were performed to test the validity of the modified error function.
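The abstract does not give the exact form of the added term, so the sketch below shows only the baseline the paper modifies: plain backpropagation on XOR with the conventional sum-of-squares error E = ½ Σ (t − o)², where both the hidden-layer and output-layer weights are updated from gradients of this same E. The network sizes, learning rate, and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Plain backpropagation on the XOR problem with the conventional
# error E = 1/2 * sum (t - o)^2.  The paper's modified error function
# adds an extra term to this E to harmonize the hidden-layer and
# output-layer weight updates; that term is not specified in the
# abstract, so it is not reproduced here.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer: 2 inputs -> 4 hidden units -> 1 output (sizes assumed).
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

lr = 0.5
losses = []
for _ in range(2000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)          # hidden-layer activations
    O = sigmoid(H @ W2 + b2)          # network output
    losses.append(0.5 * np.sum((T - O) ** 2))

    # Backward pass: deltas from the conventional error only.
    dO = (O - T) * O * (1 - O)        # delta at the output layer
    dH = (dO @ W2.T) * H * (1 - H)    # delta at the hidden layer

    # Gradient-descent updates for both layers.
    W2 -= lr * H.T @ dO
    b2 -= lr * dO.sum(axis=0)
    W1 -= lr * X.T @ dH
    b1 -= lr * dH.sum(axis=0)

print(losses[0], losses[-1])
```

Because both layers descend the same E, a poor initialization can stall one layer while the other saturates; the paper's added term is aimed precisely at this failure mode.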



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Weixing Bi (1)
  • Xugang Wang (2)
  • Ziliang Zong (3)
  • Zheng Tang (1)
  1. Faculty of Engineering, Toyama University, Toyama, Japan
  2. Intelligence Engineering Laboratory, Institute of Software, The Chinese Academy of Sciences, Beijing, China
  3. Faculty of Computer Science, Shandong University, Jinan, China