Neural Processing Letters, Volume 35, Issue 2, pp 177–186

A Novel Structure for Radial Basis Function Networks—WRBF

Article

DOI: 10.1007/s11063-011-9210-0

Cite this article as:
Khosravi, H. Neural Process Lett (2012) 35: 177. doi:10.1007/s11063-011-9210-0

Abstract

A novel structure for radial basis function networks is proposed. In this structure, unlike the traditional RBF, weights are introduced between the input and hidden layers. These weights, which take values around unity, act as multiplication factors for the input vector and perform a linear mapping. This increases the number of free parameters of the network, but since these weights are trainable, the overall performance improves significantly. Owing to the new weight vector, we call this structure the Weighted RBF, or WRBF. The weight adjustment formula is derived by applying the gradient descent algorithm. Two classification problems are used to evaluate the performance of the new network: letter classification on the UCI dataset with 16 features (a difficult problem) and digit recognition on the HODA dataset with 64 features (an easy problem). WRBF is compared with the classic RBF and MLP networks, and our experiments show that it outperforms both significantly. For example, with 200 hidden neurons, WRBF achieved a recognition rate of 92.78% on the UCI dataset, while RBF and MLP achieved 83.13% and 89.25%, respectively. On the HODA dataset, WRBF reached a 97.94% recognition rate, whereas RBF achieved 97.14% and MLP 97.63%.
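To make the described structure concrete, the following is a minimal sketch of a WRBF-style hidden layer based only on the abstract: each input is scaled element-wise by trainable weights initialized around unity before the usual RBF distance computation. The Gaussian basis, the class name WRBFLayer, the initialization scale, and the sigma parameter are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class WRBFLayer:
    """Sketch of one WRBF hidden layer: inputs are scaled element-wise by
    trainable weights (initialized near unity) before the RBF distance."""

    def __init__(self, n_inputs, n_hidden, sigma=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        # Input weights start around unity, as described in the abstract;
        # the 0.01 perturbation scale is an assumption for illustration.
        self.w = 1.0 + 0.01 * rng.standard_normal((n_hidden, n_inputs))
        # Centers and a shared spread for the (assumed Gaussian) basis functions.
        self.centers = rng.standard_normal((n_hidden, n_inputs))
        self.sigma = sigma

    def forward(self, x):
        # Element-wise weighting of the input for each hidden neuron,
        # then a Gaussian of the distance to the neuron's center.
        diff = self.w * x - self.centers           # shape: (n_hidden, n_inputs)
        d2 = np.sum(diff ** 2, axis=1)             # squared distances
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

# Example: 16 inputs (as in the UCI letter task) and 200 hidden neurons.
layer = WRBFLayer(n_inputs=16, n_hidden=200, rng=np.random.default_rng(0))
h = layer.forward(np.random.default_rng(1).standard_normal(16))
```

In this sketch the input weights self.w, the centers, and the spreads would all be updated by gradient descent on the output error, which is the training approach the abstract names; the exact update formula is given in the paper itself.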

Keywords

Radial basis · Neural network · RBF · WRBF · Classification · Gradient descent

Copyright information

© Springer Science+Business Media, LLC. 2011

Authors and Affiliations

  1. Department of Electrical and Robotic Engineering, Shahrood University of Technology, Shahrood, Iran