
A Revisit of Reducing Hidden Nodes in a Radial Basis Function Neural Network with Histogram

  • Conference paper
  • Published in: Neural Information Processing (ICONIP 2018)
  • Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11302)

Abstract

In previous work [1], an incremental radial basis function network trained with a dynamic decay adjustment algorithm (RBFNDDA) was integrated with a histogram algorithm to reduce redundant hidden neurons (hereafter, neurons). A weight-based indicator was used to identify unnecessary neurons for removal [1]. The resulting hybrid model (RBFNDDA-HIST1) reduces unnecessary neurons while maintaining satisfactory classification accuracy. However, another source of noise, i.e., overlapping among neurons of different classes, is not fully addressed in RBFNDDA-HIST1 and RBFNDDA. To close this research gap, another version of RBFNDDA-HIST (i.e., RBFNDDA-HISTR) is developed, whereby the radius of a neuron that overlaps with neurons of other classes is checked before the neuron is removed from the network. Several public data sets with a high level of overlapping records, as measured by an overlapping indicator, are used to evaluate the performance of RBFNDDA-HISTR in terms of the number of neurons and classification accuracy. A performance comparison among RBFNDDA, RBFNDDA-HISTR and RBFNDDA-HIST1 is made. The results show that the proposed RBFNDDA-HISTR can reduce the number of neurons relative to RBFNDDA-HIST1 without deteriorating classification accuracy.
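The pruning idea sketched in the abstract can be illustrated with a short example: a histogram of neuron weights flags rarely used neurons as pruning candidates, and a candidate is removed only if its receptive field overlaps a neuron of another class. This is a minimal sketch of the general idea, not the authors' exact RBFNDDA-HISTR algorithm; the function name, the lowest-bin candidate rule, and the sum-of-radii overlap test are assumptions for demonstration.

```python
import numpy as np

def prune_overlapping_neurons(centers, radii, labels, weights, n_bins=10):
    """Illustrative sketch (not the authors' exact algorithm): flag
    low-weight neurons via a histogram of the weights, then remove a
    flagged neuron only if its radius overlaps a neuron of another class.

    Returns a boolean mask of neurons to keep.
    """
    centers = np.asarray(centers, dtype=float)
    radii = np.asarray(radii, dtype=float)
    labels = np.asarray(labels)
    weights = np.asarray(weights, dtype=float)

    # Histogram the weights; treat the lowest bin as pruning candidates
    # (rarely activated neurons accumulate small weights during training).
    _, edges = np.histogram(weights, bins=n_bins)
    candidates = np.where(weights < edges[1])[0]

    keep = np.ones(len(centers), dtype=bool)
    for i in candidates:
        # Two hyperspheres overlap when the distance between their
        # centres is smaller than the sum of their radii.
        dist = np.linalg.norm(centers - centers[i], axis=1)
        overlaps_other_class = (dist < radii + radii[i]) & (labels != labels[i])
        if overlaps_other_class.any():
            keep[i] = False  # noisy: low weight AND inter-class overlap
    return keep
```

For example, with three neurons where a low-weight class-0 neuron sits inside a class-1 neuron's radius, only that neuron is dropped: `prune_overlapping_neurons([[0, 0], [0.1, 0], [5, 5]], [1, 1, 1], [0, 1, 0], [1, 10, 10])` returns the mask `[False, True, True]`.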


References

  1. Goh, P.Y., Tan, S.C., Cheah, W.P., Lim, C.P.: Reducing the complexity of an adaptive radial basis function network with a histogram algorithm. Neural Comput. Appl. 28(1), 365–378 (2017)

  2. Du, K.-L., Swamy, M.N.S.: Neural Networks and Statistical Learning. Springer, London (2014). https://doi.org/10.1007/978-1-4471-5571-3

  3. Robnik-Šikonja, M.: Data generators for learning systems based on RBF networks. IEEE Trans. Neural Netw. Learn. Syst. 27(5), 926–938 (2016)

  4. Kurjakovic, S., Svennberg, A.: Implementing an RBF in VHDL. Kungl Tekniska Högskolan (Royal Institute of Technology), Sweden (2002)

  5. Bezerra, M.E.R., Oliveira, A.L.I., Adeodato, P.J.L.: Predicting software defects: a cost-sensitive approach. In: 2011 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 2515–2522. IEEE, Anchorage (2011)

  6. Paetz, J.: Reducing the number of neurons in radial basis function networks with dynamic decay adjustment. Neurocomputing 62, 79–91 (2004)

  7. Reed, R.: Pruning algorithms - a survey. IEEE Trans. Neural Netw. 4(5), 740–747 (1993)

  8. Wang, J., Xu, C., Yang, X., Zurada, J.M.: A novel pruning algorithm for smoothing feedforward neural networks based on the group lasso method. IEEE Trans. Neural Netw. Learn. Syst. 29(5), 2012–2024 (2018)

  9. Zhai, J., Shao, Q., Wang, X.: Architecture selection of ELM networks based on sensitivity of hidden nodes. Neural Process. Lett. 44(2), 471–489 (2016)

  10. Wang, H., Yan, X.: Improved simple deterministically constructed cycle reservoir network with sensitive iterative pruning algorithm. Neurocomputing 145, 353–362 (2014)

  11. Medeiros, C.S., Barreto, G.: A novel weight pruning method for MLP classifiers based on the MAXCORE principle. Neural Comput. Appl. 22(1), 71–84 (2013)

  12. Leung, C., Tsoi, A.: Combined learning and pruning for recurrent radial basis function networks based on recursive least square algorithms. Neural Comput. Appl. 15(1), 62–78 (2005)

  13. Kusy, M., Kluska, J.: Probabilistic neural network structure reduction for medical data classification. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds.) ICAISC 2013. LNCS (LNAI), vol. 7894, pp. 118–129. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-38658-9_11

  14. Xie, Y., Fan, X., Chen, J.: Affinity propagation-based probability neural network structure optimization. In: 2014 Tenth International Conference on Computational Intelligence and Security, pp. 85–89. IEEE, Kunming (2014)

  15. Xing, H.J., Hu, B.G.: Two-phase construction of multilayer perceptrons using information theory. IEEE Trans. Neural Netw. 20(4), 715–721 (2009)

  16. Deco, G., Finnoff, W., Zimmermann, H.G.: Elimination of overtraining by a mutual information network. In: Gielen, S., Kappen, B. (eds.) ICANN '93, pp. 744–749. Springer, London (1993). https://doi.org/10.1007/978-1-4471-2063-6_208

  17. Thomas, P., Suhner, M.-C.: A new multilayer perceptron pruning algorithm for classification and regression applications. Neural Process. Lett. 42(2), 437–458 (2015)

  18. Lo, J.T.: Statistical method of pruning neural networks. In: International Joint Conference on Neural Networks (IJCNN 1999), Washington, DC, vol. 3, pp. 1678–1680 (1999)

  19. Ioannidis, Y.: The history of histograms. In: Freytag, J.-C., Lockemann, P., Abiteboul, S., Carey, M., Selinger, P., Heuer, A. (eds.) Proceedings of the 29th International Conference on Very Large Data Bases (VLDB 2003), Berlin, Germany, vol. 29, pp. 19–30 (2003)

  20. Martín-Albo, D., Romero, V., Vidal, E.: An experimental study of pruning techniques in handwritten text recognition systems. In: Sanches, J.M., Micó, L., Cardoso, J.S. (eds.) IbPRIA 2013. LNCS, vol. 7887, pp. 559–566. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-38628-2_66

  21. Steinbiss, V., Tran, B.-H., Ney, H.: Improvements in beam search. In: Third International Conference on Spoken Language Processing, Yokohama, Japan, pp. 2143–2146 (1994)

  22. Zhang, W.-Q., Liu, J.: Two-stage method for specific audio retrieval. In: IEEE International Conference on Acoustics, Speech and Signal Processing, Honolulu, HI, vol. 4, pp. IV-85–IV-88 (2007)

  23. Kashino, K., Kurozumi, T., Murase, H.: A quick search method for audio and video signals based on histogram pruning. IEEE Trans. Multimed. 5(3), 348–357 (2003)

  24. Tang, W., Mao, K.Z., Mak, L.O., Ng, G.W.: Classification for overlapping classes using optimized overlapping region detection and soft decision. In: 13th International Conference on Information Fusion, pp. 1–8. IEEE, Edinburgh (2010)

  25. Berthold, M.R., Diamond, J.: Boosting the performance of RBF networks with dynamic decay adjustment. In: Proceedings of the 7th International Conference on Neural Information Processing Systems (NIPS 1994), pp. 521–528. MIT Press, Cambridge (1995)

  26. Shimazaki, H., Shinomoto, S.: A method for selecting the bin size of a time histogram. Neural Comput. 19(6), 1503–1527 (2007)

  27. Hayashi, Y., Tsuji, H., Saga, R.: Visualizing method based on item sales records and its experimentation. In: 2009 IEEE International Conference on Systems, Man and Cybernetics, pp. 524–528. IEEE, San Antonio (2009)

  28. Bache, K., Lichman, M.: UCI Machine Learning Repository. University of California, School of Information and Computer Science (2013). http://archive.ics.uci.edu/ml


Author information

Corresponding author: Pey Yun Goh


Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Goh, P.Y., Tan, S.C., Cheah, W.P. (2018). A Revisit of Reducing Hidden Nodes in a Radial Basis Function Neural Network with Histogram. In: Cheng, L., Leung, A., Ozawa, S. (eds.) Neural Information Processing (ICONIP 2018). Lecture Notes in Computer Science, vol. 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_64

  • DOI: https://doi.org/10.1007/978-3-030-04179-3_64

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-04178-6

  • Online ISBN: 978-3-030-04179-3

  • eBook Packages: Computer Science, Computer Science (R0)
