
Kernelized LARS–LASSO for constructing radial basis function neural networks

Neural Computing and Applications (ISNN2012)

Abstract

Model structure selection is of crucial importance in radial basis function (RBF) neural networks. Existing model structure selection algorithms are essentially forward selection or backward elimination methods that may lead to sub-optimal models. This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method. By formulating the RBF neural network as a linear-in-the-parameters model, we derive an ℓ1-constrained objective function for training the network. The proposed algorithm makes it possible to dynamically drop a previously selected regressor term that has become insignificant. Furthermore, inspired by the idea of LARS, the computation of output weights in our algorithm is greatly simplified. Since the proposed algorithm conducts model structure selection and parameter optimization simultaneously, it builds networks with better generalization performance. Computational experiments with artificial and real-world data confirm the efficacy of the proposed algorithm.
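In the linear-in-the-parameters formulation the abstract describes, the network output is a weighted sum of kernel regressors and training reduces to an ℓ1-constrained least-squares problem. A minimal sketch of that objective (the notation Φ for the design matrix of kernel evaluations is an assumption, not the paper's exact symbols):

    \hat{y}(x) = \sum_{k=1}^{M} w_k \, \phi_k(x),
    \qquad
    \min_{w} \; \lVert y - \Phi w \rVert_2^2
    \quad \text{subject to} \quad \lVert w \rVert_1 \le t,

where each regressor \phi_k(\cdot) is a radial basis function placed on a candidate centre and the budget t controls how many regressors survive with non-zero weight.

To make the selection mechanism concrete, the sketch below applies a standard LARS-LASSO solver to a Gaussian RBF design matrix. It is an illustration of the technique, not the authors' implementation: the kernel width gamma, the penalty alpha, and the choice of every training sample as a candidate centre are all illustrative assumptions.

    # A minimal, self-contained sketch (not the paper's exact algorithm):
    # build an RBF design matrix with every training point as a candidate
    # centre, then let a LARS-LASSO solver pick a sparse subset of centres.
    # `gamma` and `alpha` are illustrative values, not tuned choices.
    import numpy as np
    from sklearn.linear_model import LassoLars

    rng = np.random.default_rng(0)

    # Toy 1-D regression data: y = sinc(x) + noise.
    X = rng.uniform(-3.0, 3.0, size=(100, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(100)

    def rbf_design(X, centres, gamma=1.0):
        """Gaussian RBF design matrix: Phi[i, k] = exp(-gamma * ||x_i - c_k||^2)."""
        sq_dists = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * sq_dists)

    centres = X.copy()              # every sample is a candidate centre
    Phi = rbf_design(X, centres)

    # LassoLars follows the LARS path with the LASSO modification, so a
    # regressor that entered the active set can later be dropped again;
    # this is the "dynamic dropping" behaviour highlighted in the abstract.
    model = LassoLars(alpha=1e-3)
    model.fit(Phi, y)

    selected = np.flatnonzero(model.coef_)
    print(f"selected {selected.size} of {len(centres)} candidate centres")

Because LARS traces the entire ℓ1 regularization path at low cost, centres can both enter and leave the active set along the way, which pure forward selection cannot do.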



Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 61273233, the Research Foundation for the Doctoral Program of Higher Education under Grant 20090002110035, the Project of China Ocean Association under Grant DY125-25-02, and the Tsinghua University Initiative Scientific Research Program under Grants 2010THZ07002 and 2011THZ07132.

Author information

Corresponding author

Correspondence to Shiji Song.


Cite this article

Zhou, Q., Song, S., Wu, C. et al. Kernelized LARS–LASSO for constructing radial basis function neural networks. Neural Comput & Applic 23, 1969–1976 (2013). https://doi.org/10.1007/s00521-012-1189-6
