Abstract
Determining an appropriate neural-network (NN) structure for a given learning or training task is an important issue, since NN performance depends heavily on it. To remedy the weaknesses of conventional back-propagation (BP) neural networks and their learning algorithms, a new Laguerre orthogonal basis neural network is constructed. Based on this special structure, a weights-direct-determination method is derived, which obtains the optimal weights of such a neural network directly (that is, in a single step). Furthermore, a growing algorithm is presented that determines the smallest number of hidden-layer neurons. Theoretical analysis and simulation results substantiate the efficacy of the Laguerre-orthogonal-basis neural network and its growing algorithm based on the weights-direct-determination method.
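The abstract's two key ideas can be illustrated concretely: if the hidden-layer neurons use Laguerre polynomials as activation functions, the output weights solve a linear least-squares problem, so they can be computed in one step rather than by iterative BP training; and the network can be grown one neuron at a time until the fit is good enough. The sketch below, in Python with NumPy, is a minimal illustration under those assumptions; the function names, the error tolerance, and the stopping rule are illustrative, not taken from the paper.

```python
# Hypothetical sketch: weights-direct-determination (least-squares) for a
# single-input Laguerre-basis network, plus a growing loop that adds hidden
# neurons (basis functions) until the training error falls below a tolerance.
# Names and thresholds are illustrative assumptions, not the paper's own.
import numpy as np

def laguerre_design_matrix(x, n_neurons):
    """Columns are Laguerre polynomials L_0..L_{n_neurons-1} evaluated at x,
    built with the three-term recurrence
    (n+1) L_{n+1}(x) = (2n+1-x) L_n(x) - n L_{n-1}(x)."""
    x = np.asarray(x, dtype=float)
    Phi = np.empty((x.size, n_neurons))
    Phi[:, 0] = 1.0                      # L_0(x) = 1
    if n_neurons > 1:
        Phi[:, 1] = 1.0 - x              # L_1(x) = 1 - x
    for n in range(1, n_neurons - 1):
        Phi[:, n + 1] = ((2 * n + 1 - x) * Phi[:, n] - n * Phi[:, n - 1]) / (n + 1)
    return Phi

def grow_and_determine(x, y, tol=1e-8, max_neurons=30):
    """Grow the hidden layer one neuron at a time; at each size the output
    weights come directly from one least-squares solve (no iterative BP)."""
    for n in range(1, max_neurons + 1):
        Phi = laguerre_design_matrix(x, n)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # weights in one step
        mse = np.mean((Phi @ w - y) ** 2)
        if mse < tol:
            return n, w, mse
    return max_neurons, w, mse

# Example: approximate a smooth target function on [0, 4].
x = np.linspace(0.0, 4.0, 200)
y = np.exp(-x / 2.0) * np.sin(x)
n, w, mse = grow_and_determine(x, y)
```

Because the target is smooth, the loop typically stops well before `max_neurons`, which is the point of the growing algorithm: the smallest adequate hidden-layer size is found without trial-and-error retraining.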
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, Y., Zhong, T., Li, W., Xiao, X., Yi, C. (2008). Growing Algorithm of Laguerre Orthogonal Basis Neural Network with Weights Directly Determined. In: Huang, D.-S., Wunsch, D.C., Levine, D.S., Jo, K.-H. (eds.) Advanced Intelligent Computing Theories and Applications. With Aspects of Artificial Intelligence. ICIC 2008. Lecture Notes in Computer Science, vol. 5227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85984-0_8
DOI: https://doi.org/10.1007/978-3-540-85984-0_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-85983-3
Online ISBN: 978-3-540-85984-0
eBook Packages: Computer Science (R0)