Performance Evaluation of Recurrent RBF Network in Nearest Neighbor Classification

  • Mehmet Kerem Müezzinoğlu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3949)


Superposition of radial basis functions centered at given prototype patterns constitutes one of the most suitable energy forms for gradient systems that perform nearest neighbor classification with real-valued static prototypes. It has been shown in [1] that a continuous-time dynamical neural network model, employing radial basis function and sigmoid multi-layer perceptron sub-networks, is capable of maximizing such an energy form locally, and thus of performing nearest neighbor classification almost perfectly when initiated with a distorted pattern. This paper reviews the proposed design procedure and presents the results of extensive experiments with the classifier on random prototypes.
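The core idea of the abstract can be sketched numerically: place a Gaussian radial basis function at each prototype, let the state follow the gradient flow of their superposition, and read off the prototype nearest to the resulting local maximum. The sketch below is an illustrative Euler-integrated approximation of such a gradient system, not the authors' recurrent RBF/MLP architecture; the width `sigma`, step size, and iteration count are assumed values chosen for the toy example.

```python
import numpy as np

def rbf_energy_grad(x, prototypes, sigma=0.3):
    # Gradient of E(x) = sum_i exp(-||x - p_i||^2 / (2 sigma^2)),
    # the superposition of Gaussian RBFs centered at the prototypes.
    diffs = prototypes - x                                  # (m, n)
    w = np.exp(-np.sum(diffs**2, axis=1) / (2 * sigma**2))  # RBF activations
    return (w[:, None] * diffs).sum(axis=0) / sigma**2

def classify(x0, prototypes, sigma=0.3, step=0.05, iters=2000):
    # Euler-integrate the gradient flow dx/dt = grad E(x) from the
    # distorted pattern x0; the trajectory climbs to a local maximum
    # of the energy, which lies near the closest prototype.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x += step * rbf_energy_grad(x, prototypes, sigma)
    # Report the index of the prototype nearest to the final state.
    return int(np.argmin(np.linalg.norm(prototypes - x, axis=1)))

prototypes = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(classify([0.9, 0.8], prototypes))  # index of the nearest prototype
```

With a sufficiently small RBF width relative to the spacing of the prototypes, each prototype sits near its own local maximum of the energy, so the flow implements nearest neighbor recall from a distorted input.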


Keywords: Radial Basis Function · Associative Memory · Pattern Space · Unit Hypercube · Radial Basis Function Network Model




References

  1. Muezzinoglu, M.K., Zurada, J.M.: RBF-based neurodynamic nearest neighbor classification in real pattern space (submitted, 2005)
  2. Garey, M.R., Johnson, D.S.: Computers and Intractability. W.H. Freeman, New York (1979)
  3. Hopfield, J.J.: Neural networks and physical systems with emergent collective computational abilities. Proc. National Acad. Sci. 79, 2554–2558 (1982)
  4. Bruck, J., Goodman, J.W.: A generalized convergence theorem for neural networks. IEEE Trans. Information Theory 34, 1089–1092 (1988)
  5. Müezzinoğlu, M.K., Güzeliş, C., Zurada, J.M.: An energy function-based design method for discrete Hopfield associative memory with attractive fixed points. IEEE Trans. Neural Networks 16, 370–378 (2005)
  6. Sanner, R.M., Slotine, J.J.M.: Gaussian networks for direct adaptive control. IEEE Trans. Neural Networks 3, 837–863 (1992)
  7. Billings, S.A., Fung, C.F.: Recurrent radial basis function networks for adaptive noise cancellation. Neural Networks 8, 273–290 (1995)
  8. Poggio, T., Girosi, F.: Networks for approximation and learning. Proceedings of the IEEE 78, 1481–1497 (1990)
  9. Park, J., Sandberg, I.W.: Universal approximation using radial-basis-function networks. Neural Computation 3, 246–257 (1991)
  10. Karayiannis, N.B.: Reformulated radial basis neural networks trained by gradient descent. IEEE Trans. Neural Networks 10, 657–671 (1999)
  11. Khalil, H.K.: Nonlinear Systems, 3rd edn. Prentice-Hall, Englewood Cliffs (2001)
  12. Zurada, J.M.: Introduction to Artificial Neural Systems. West, St. Paul (1992)
  13. Michel, A.N., Liu, D.: Qualitative Analysis and Synthesis of Recurrent Neural Networks. Marcel Dekker, New York (2002)
  14. Shampine, L.F., Reichelt, M.W.: The MATLAB ODE Suite. SIAM Journal on Scientific Computing 18, 1–22 (1997)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Mehmet Kerem Müezzinoğlu (1)
  1. Computational Intelligence Lab., University of Louisville, Louisville, U.S.A.
