
Neural Computing and Applications, Volume 23, Issue 7–8, pp 1969–1976

Kernelized LARS–LASSO for constructing radial basis function neural networks

  • Quan Zhou
  • Shiji Song
  • Cheng Wu
  • Gao Huang
ISNN2012

Abstract

Model structure selection is of crucial importance in radial basis function (RBF) neural networks. Existing model structure selection algorithms are essentially forward selection or backward elimination methods that may lead to sub-optimal models. This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method. By formulating the RBF neural network as a linear-in-the-parameters model, we derive an l1-constrained objective function for training the network. The proposed algorithm can dynamically drop a previously selected regressor term that has become insignificant. Furthermore, inspired by the idea of LARS, the computation of the output weights is greatly simplified. Since the proposed algorithm conducts model structure selection and parameter optimization simultaneously, it builds a network with better generalization performance. Computational experiments on artificial and real-world data confirm the efficacy of the proposed algorithm.
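The construction described in the abstract can be sketched in a few lines: treat every training point as a candidate RBF centre, form the kernel matrix as the regressor matrix of a linear-in-the-parameters model, and let an l1-penalized LARS path select a sparse subset of centres (with the ability to drop a previously added regressor, unlike pure forward selection). The sketch below is an illustration of this general idea using scikit-learn's `LassoLars`, not the authors' algorithm; the Gaussian width, penalty `alpha`, and toy sinc data are assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))                      # training inputs
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(100)   # noisy target

# Linear-in-the-parameters model: each training point is a candidate RBF centre,
# so the kernel (regressor) matrix is square.
width = 1.0                                                # assumed kernel width
sq_dist = (X - X.T) ** 2                                   # pairwise squared distances
Phi = np.exp(-sq_dist / (2 * width**2))                    # Gaussian RBF regressor matrix

# LARS traces the solution path of the l1-constrained least-squares problem;
# along the path a regressor can be dropped again, unlike forward-only
# selection schemes such as OLS centre selection.
model = LassoLars(alpha=1e-3)
model.fit(Phi, y)

selected = np.flatnonzero(model.coef_)                     # indices of surviving centres
print(f"selected {selected.size} of {len(X)} candidate centres")
```

The sparsity of the resulting weight vector directly determines the network size: only the centres with nonzero coefficients are kept, so structure selection and weight estimation happen in a single pass over the LARS path.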

Keywords

Kernelized LARS–LASSO · Radial basis function (RBF) · Model structure selection


Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 61273233, the Research Foundation for the Doctoral Program of Higher Education under Grant 20090002110035, the Project of China Ocean Association under Grant DY125-25-02, and the Tsinghua University Initiative Scientific Research Program under Grants 2010THZ07002 and 2011THZ07132.


Copyright information

© Springer-Verlag London 2012

Authors and Affiliations

  1. Department of Automation, TNList, Tsinghua University, Beijing, China
