Neural Processing Letters, Volume 35, Issue 2, pp 177–186

A Novel Structure for Radial Basis Function Networks—WRBF

  • Hossein Khosravi


A novel structure for radial basis function networks is proposed. In this structure, unlike the traditional RBF network, weights are placed between the input and hidden layers. These weights, which take values around unity, act as multiplicative factors on the input vector and perform a linear mapping. This increases the number of free parameters of the network, but since these weights are trainable, the overall performance of the network improves significantly. Owing to the new weight vector, we call this structure the Weighted RBF, or WRBF. A weight adjustment formula is derived by applying the gradient descent algorithm. Two classification problems were used to evaluate the performance of the new network: letter classification on the UCI dataset with 16 features (a difficult problem) and digit recognition on the HODA dataset with 64 features (an easy problem). WRBF was compared with the classic RBF network and the MLP, and our experiments show that WRBF significantly outperforms both. For example, with 200 hidden neurons, WRBF achieved a recognition rate of 92.78% on the UCI dataset, while RBF and MLP achieved 83.13% and 89.25%, respectively. On the HODA dataset, WRBF reached a recognition rate of 97.94%, whereas RBF achieved 97.14% and MLP 97.63%.
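The structure described above can be illustrated with a minimal sketch of a WRBF forward pass. This is an assumption-laden illustration, not the authors' implementation: the function name, the use of Gaussian basis functions, and all parameter names are hypothetical; the only elements taken from the abstract are the trainable per-input weights (initialized near unity) that linearly rescale the input before it reaches the RBF hidden layer.

```python
import numpy as np

def wrbf_forward(x, input_weights, centers, sigmas, output_weights):
    """Forward pass of a WRBF network (illustrative sketch).

    x              : (d,) input vector
    input_weights  : (d,) trainable multipliers, initialized around 1.0
                     (the extra weight vector that distinguishes WRBF from RBF)
    centers        : (h, d) RBF centers
    sigmas         : (h,) RBF widths
    output_weights : (h, c) hidden-to-output weights
    """
    z = input_weights * x                      # linear input mapping added by WRBF
    d2 = np.sum((z - centers) ** 2, axis=1)    # squared distances to each center
    phi = np.exp(-d2 / (2.0 * sigmas ** 2))    # Gaussian hidden-layer activations
    return phi @ output_weights                # linear output layer
```

With `input_weights` fixed at all ones, the network reduces exactly to a classic RBF network; training those multipliers by gradient descent, as the paper proposes, is what adds the extra flexibility.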


Keywords: Radial basis · Neural network · RBF · WRBF · Classification · Gradient descent





Copyright information

© Springer Science+Business Media, LLC. 2011

Authors and Affiliations

Department of Electrical and Robotic Engineering, Shahrood University of Technology, Shahrood, Iran
