Neural Computing and Applications, Volume 18, Issue 7, pp 769–779

A hybrid MPSO-BP structure adaptive algorithm for RBFNs

Original Article

Abstract

This paper introduces a novel hybrid algorithm that automatically determines the parameters of radial basis function neural networks (number of neurons, centers, widths and weights). The algorithm combines mix-encoding particle swarm optimization with the back-propagation (BP) algorithm to form a hybrid learning algorithm (MPSO-BP) for training radial basis function networks (RBFNs); it adapts the network structure and updates the weights by means of a specially chosen fitness function. The proposed method is applied to three nonlinear problems, and the results are compared with those reported in the literature, showing an improvement over the published methods.
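To make the abstract's description concrete, the short Python sketch below shows a Gaussian RBF network forward pass and a plain BP-style gradient step on the output weights. This is not the authors' implementation: the function names, the toy sine-regression data, the choice of 10 hidden neurons, the fixed width of 0.3 and the learning rate are all illustrative assumptions, and the mix-encoding PSO search over the structure is only indicated by comments.

import numpy as np

def rbf_forward(X, centers, widths, weights):
    # Gaussian hidden activations: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    return phi @ weights, phi            # network output and hidden activations

def bp_weight_step(X, y, centers, widths, weights, lr=0.1):
    # One BP-style gradient-descent step on the output weights (MSE loss).
    y_hat, phi = rbf_forward(X, centers, widths, weights)
    grad = phi.T @ (y_hat - y) / len(y)
    return weights - lr * grad

# Illustrative toy problem; in MPSO-BP the number of neurons, the centers and
# the widths would come from the mix-encoding PSO search, not be fixed by hand.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0])
centers = np.linspace(-1.0, 1.0, 10)[:, None]   # assumed 10 hidden neurons
widths = np.full(10, 0.3)                       # assumed common width
weights = np.zeros(10)
for _ in range(1000):
    weights = bp_weight_step(X, y, centers, widths, weights)

In the paper, a special fitness function guides PSO particles that encode both the discrete structure (number of neurons) and the real-valued centers and widths; the sketch above covers only the BP refinement half of such a loop.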

Keywords

Particle swarm optimization · Mix encoding · Radial basis function neural networks · Self-adapt

Copyright information

© Springer-Verlag London Limited 2008

Authors and Affiliations

  1. School of Economics and Management, China University of Geosciences, Wuhan, People’s Republic of China