Extreme Reformulated Radial Basis Function Neural Networks

  • Gexin Bi
  • Fang Dong
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 56)


Gradient-descent-based learning algorithms are generally slow because of improperly chosen learning steps, and they may easily converge to local minima; many iterations may be required before such algorithms reach good learning performance. This paper proposes a new learning algorithm for reformulated radial basis function neural networks (R-RBFNs) that randomly chooses the hidden nodes and analytically determines the output weights. Experimental results on several benchmark problems show that the proposed algorithm tends to provide better generalization performance at extremely fast learning speed.
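The training scheme summarized above — draw the hidden nodes at random, then solve for the output weights analytically with a pseudoinverse — can be sketched roughly as follows. This is a minimal illustration assuming plain Gaussian RBF hidden units; the paper's reformulated basis functions built from generator functions are not reproduced, and the function names and width range are illustrative, not from the paper.

```python
import numpy as np

def elm_rbf_fit(X, T, n_hidden=30, rng=None):
    """ELM-style training sketch for an RBF network: hidden centers and
    widths are drawn at random, and the output weights are determined
    analytically via the Moore-Penrose pseudoinverse (no iteration)."""
    rng = np.random.default_rng(rng)
    n, _ = X.shape
    centers = X[rng.integers(0, n, n_hidden)]        # random centers picked from the data
    widths = rng.uniform(0.5, 2.0, n_hidden)         # random widths (assumed range)
    # Gaussian hidden-layer output matrix H, shape (n_samples, n_hidden)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    H = np.exp(-d2 / (2.0 * widths ** 2))
    beta = np.linalg.pinv(H) @ T                     # analytic output weights
    return centers, widths, beta

def elm_rbf_predict(X, centers, widths, beta):
    """Evaluate the trained network on new inputs."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    H = np.exp(-d2 / (2.0 * widths ** 2))
    return H @ beta
```

Because the hidden layer is fixed once sampled, training reduces to a single linear least-squares solve, which is the source of the "extremely fast learning speed" claimed in the abstract.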


Keywords: Reformulated radial basis function neural network · Gradient descent based learning algorithm · Admissible radial basis function · Generator function





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Gexin Bi (1)
  • Fang Dong (1)

  1. College of Navigation, Dalian Maritime University, Dalian, China
