TWRBF – Transductive RBF Neural Network with Weighted Data Normalization

  • Qun Song
  • Nikola Kasabov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3316)

Abstract

This paper introduces a novel RBF model, the Transductive Radial Basis Function Neural Network with Weighted Data Normalization (TWRBF). In transductive systems, a local model is developed for every new input vector, based on the training samples closest to that vector. The Weighted Data Normalization (WDN) method optimizes the normalization range individually for each input variable of the system. A gradient descent algorithm is used to train the TWRBF model. The TWRBF is illustrated on two case-study prediction/identification problems: the first is prediction of the Mackey-Glass time series, and the second is a real medical decision-support problem of estimating the level of renal function in patients. The proposed TWRBF method not only achieves good accuracy for an individual, "personalized" model, but also identifies the most significant variables for that model.
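The transductive scheme described above can be sketched as follows: for each new query vector, select its nearest training samples under a per-variable weighted distance, fit a local Gaussian RBF model on that neighbourhood only, and predict from it. This is a minimal illustrative sketch, not the paper's implementation: the function name, the fixed kernel width `sigma`, and the fixed normalization weights `w` are assumptions, whereas the TWRBF optimizes the weights and model parameters by gradient descent.

```python
import numpy as np

def transductive_rbf_predict(X_train, y_train, x_query, k=10, w=None, sigma=0.5):
    """Predict the output for one query vector with a local RBF model
    built from its k nearest training samples (transductive scheme).

    `w` plays the role of the per-variable normalization weights (WDN);
    here they are held fixed rather than learned by gradient descent.
    """
    if w is None:
        w = np.ones(X_train.shape[1])
    # Weighted distance from the query to every training sample
    d = np.sqrt((((X_train - x_query) * w) ** 2).sum(axis=1))
    idx = np.argsort(d)[:k]            # indices of the k nearest neighbours
    Xl, yl = X_train[idx], y_train[idx]
    # Gaussian RBF design matrix over the local set (centres = local samples)
    D2 = (((Xl[:, None, :] - Xl[None, :, :]) * w) ** 2).sum(axis=2)
    Phi = np.exp(-D2 / (2 * sigma ** 2))
    # Small ridge term keeps the solve numerically stable
    coef = np.linalg.solve(Phi + 1e-6 * np.eye(k), yl)
    # Evaluate the local model at the query point
    phi_q = np.exp(-(((Xl - x_query) * w) ** 2).sum(axis=1) / (2 * sigma ** 2))
    return phi_q @ coef
```

Because a fresh local model is solved for every query, the method naturally yields an individual, "personalized" model per input vector, at the cost of a small linear solve at prediction time.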



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Qun Song¹
  • Nikola Kasabov¹
  1. Knowledge Engineering & Discovery Research Institute, Auckland University of Technology, Auckland, New Zealand
