A Parsimonious Radial Basis Function-Based Neural Network for Data Classification

  • Shing Chiang Tan
  • Chee Peng Lim
  • Junzo Watada
Chapter
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 42)

Abstract

The radial basis function neural network trained with the dynamic decay adjustment algorithm (known as RBFNDDA) exhibits greedy insertion behavior, recruiting many hidden nodes to encode information during training. In this chapter, a new variant of RBFNDDA is proposed to rectify this deficiency. Specifically, the hidden nodes of RBFNDDA are re-organized through the supervised Fuzzy ARTMAP (FAM) classifier, and the parameters of these nodes are adapted using the Harmonic Means (HM) algorithm. The performance of the proposed model is evaluated empirically using three benchmark data sets. The results indicate that the proposed model is able to produce a compact network structure and, at the same time, to provide high classification performance.
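
For context, the sketch below illustrates the standard RBF-DDA training rule of Berthold and Diamond that the chapter builds on, and in particular the commit step responsible for the greedy node insertion described above. It is a simplified, illustrative reconstruction: the thresholds THETA_PLUS and THETA_MINUS are commonly used defaults rather than values from the chapter, width initialization and per-epoch weight resets are omitted, and the FAM re-organization and HM adaptation proposed by the authors are not shown.

    import numpy as np

    # Assumed default DDA thresholds (not taken from the chapter).
    THETA_PLUS, THETA_MINUS = 0.40, 0.20

    def rbf(x, center, sigma):
        """Gaussian activation of a single hidden node."""
        return np.exp(-np.sum((x - center) ** 2) / (sigma ** 2))

    def dda_epoch(X, y, prototypes):
        """One pass of the dynamic decay adjustment rule.

        prototypes: list of dicts with keys 'center', 'sigma', 'weight', 'label'.
        """
        for x, c in zip(X, y):
            # Cover: if a node of the correct class already fires above
            # THETA_PLUS, only its weight is increased.
            covered = False
            for p in prototypes:
                if p['label'] == c and rbf(x, p['center'], p['sigma']) >= THETA_PLUS:
                    p['weight'] += 1.0
                    covered = True
                    break
            # Commit: otherwise a new hidden node is inserted at x; this is
            # the greedy insertion behavior the chapter sets out to curb.
            if not covered:
                prototypes.append({'center': np.asarray(x, dtype=float),
                                   'sigma': 1.0, 'weight': 1.0, 'label': c})
            # Shrink: nodes of other classes must activate below THETA_MINUS at x.
            for p in prototypes:
                if p['label'] != c:
                    dist = np.linalg.norm(x - p['center'])
                    p['sigma'] = min(p['sigma'], dist / np.sqrt(-np.log(THETA_MINUS)))
        return prototypes

Because every uncovered training sample commits a fresh node, the hidden layer can grow quickly on noisy or overlapping data, which motivates the FAM-based re-organization and HM-based parameter adaptation proposed in the chapter.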

Keywords

Radial basis function neural network · Adaptive resonance theory · Harmonic mean algorithm · Classification

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Shing Chiang Tan, Multimedia University, Cyberjaya, Malaysia
  • Chee Peng Lim, Centre for Intelligent Systems Research, Deakin University, Geelong, Australia
  • Junzo Watada, Waseda University, Tokyo, Japan
