Abstract
An improved backpropagation algorithm is proposed that uses the Lyapunov method to minimize an absolute error function. The improved algorithm drives both the error and the gradient to zero, so the local minima problem can be avoided. In addition, because the absolute error function is used, the algorithm is more robust and learns faster than backpropagation with the traditional squared error function when the target signals contain incorrect data. The paper also presents a method of using Lyapunov stability theory to derive a learning algorithm that directly minimizes the absolute error function.
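The core modification is easy to see in code. Below is a minimal sketch, not the authors' Lyapunov-derived update rule from the full paper, of plain gradient-descent backpropagation in which the squared error 0.5e² is replaced by the absolute error |e|; since d|e|/de = sign(e), the only change to the backward pass is that the output error e is replaced by sign(e). The network size, toy data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

# Sketch: one-hidden-layer network trained by gradient descent on the
# absolute error |t - y| instead of the usual squared error.
# The subgradient of |e| is sign(e), so sign(E) replaces E below.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data (XOR); sizes and hyperparameters are assumptions.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
eta = 0.1                                  # learning rate (assumed)

for epoch in range(20000):
    # Forward pass
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    E = T - Y

    # Backward pass: sign(E) is the only change relative to
    # squared-error backpropagation, where delta_out = E * Y * (1 - Y).
    delta_out = np.sign(E) * Y * (1 - Y)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)

    W2 += eta * H.T @ delta_out
    W1 += eta * X.T @ delta_hid

print(np.round(Y, 3))  # outputs approach the XOR targets
```

Because sign(E) has constant magnitude, a single grossly wrong target value contributes the same bounded push as a mildly wrong one, which is the intuition behind the robustness claim above.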
This work was supported by the National Natural Science Foundation of China under Grant 60471055 and by the Specialized Research Fund for the Doctoral Program of Higher Education under Grant 20040614017.
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Lv, J., Yi, Z. (2005). An Improved Backpropagation Algorithm Using Absolute Error Function. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_93
DOI: https://doi.org/10.1007/11427391_93
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4
eBook Packages: Computer Science, Computer Science (R0)