
Leave-one-out cross-validation-based model selection for multi-input multi-output support vector machine

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

As an effective approach to multi-input multi-output regression estimation problems, multi-dimensional support vector regression (SVR), named M-SVR, generally obtains better predictions than applying a conventional support vector machine (SVM) independently to each output dimension. However, although many generalization error bounds exist for conventional SVMs, none of them can be applied directly to M-SVR. In this paper, a new leave-one-out (LOO) error estimate for M-SVR is first derived through a virtual LOO cross-validation procedure. This LOO error estimate can be computed directly once training is complete, with lower computational complexity than the traditional LOO method. Based on this estimate, a new model selection method for M-SVR built on a multi-objective optimization strategy is then proposed. Experiments on both a noisy toy regression function and a practical engineering data set (dynamic load identification on a cylinder vibration system) demonstrate the competitive generalization performance and computational cost of the proposed method.
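To make the computational advantage concrete, the sketch below shows the standard brute-force LOO procedure that the paper's virtual LOO estimate is designed to avoid: the model must be retrained n times, once per held-out sample. A multi-output kernel ridge regressor stands in for M-SVR here; all function names, the kernel choice, and the parameters `gamma` and `lam` are illustrative assumptions, not the authors' implementation.

```python
# Brute-force leave-one-out error for a multi-output regressor.
# Multi-output kernel ridge regression is used as a stand-in for M-SVR;
# the paper's virtual LOO estimate avoids the n retrainings performed here.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_predict(X_tr, Y_tr, X_te, gamma=1.0, lam=1e-2):
    """Multi-output kernel ridge: one linear solve handles all outputs at once."""
    K = rbf_kernel(X_tr, X_tr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), Y_tr)
    return rbf_kernel(X_te, X_tr, gamma) @ alpha

def loo_error(X, Y, gamma=1.0, lam=1e-2):
    """Traditional LOO estimate: n full retrainings, one per held-out point."""
    n = len(X)
    errs = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        pred = fit_predict(X[mask], Y[mask], X[i:i + 1], gamma, lam)
        errs[i] = np.sum((pred[0] - Y[i]) ** 2)  # squared error over all outputs
    return errs.mean()

# Toy two-input, two-output data, loosely mirroring the paper's noisy toy setting.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 2))
Y = np.column_stack([np.sin(X[:, 0]), X[:, 1] ** 2]) \
    + 0.05 * rng.standard_normal((40, 2))
print(loo_error(X, Y))  # mean LOO squared error, summed across outputs
```

In model selection, this scalar would be minimized over the hyper-parameters (here `gamma` and `lam`); the paper's contribution is an estimate of the same quantity that is available after a single training run.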


References

  1. Vapnik VN (1995) The nature of statistical learning theory. Springer, New York

  2. Schölkopf B, Smola AJ (2002) Learning with Kernels. MIT Press, Cambridge

  3. Vapnik VN, Golowich S, Smola AJ (1997) Support vector method for function approximation, regression estimation, and signal processing. In: Mozer M, Jordan M, Petsche T (eds) Advances in neural information processing systems, vol 9. MIT Press, Cambridge, pp 281–287

  4. Smola AJ, Schölkopf B (1998) A tutorial on support vector regression. NeuroCOLT Technical Report TR, Royal Holloway College, London

  5. Li JW, Wang YH, Wu Q, Wei YF, An JL (2008) EEG source localization of ERP based on multidimensional support vector regression approach. In: Proceedings of 2008 international conference on machine learning and cybernetics. IEEE Press, Kunming, China, pp 1238–1243

  6. Pérez-Cruz F, Camps-Valls G, Soria-Olivas E et al (2002) Multi-dimensional function approximation and regression estimation. Lect Notes Comput Sci 2415/2002:757–762

  7. Sánchez-Fernández M, De Prado-Cumplido M, Arenas-García J, Pérez-Cruz F (2004) SVM multiregression for nonlinear channel estimation in multiple-input multiple-output systems. IEEE Trans Signal Process 52(8):2298–2307

  8. Tuia D, Verrelst J, Alonso L, Pérez-Cruz F, Camps-Valls G (2011) Multioutput support vector regression for remote sensing biophysical parameter estimation. IEEE Geosci Remote Sens Lett 8:804–808

  9. Chapelle O, Vapnik V (2000) Model selection for support vector machines. In: Solla SA, Leen TK, Müller KR (eds) Advances in neural information processing systems, vol 12. MIT Press, Cambridge, pp 230–236

  10. Huang CM, Lee YJ, Lin DKJ, Huang SY (2007) Model selection for support vector machines via uniform design. Comput Stat Data Anal 52(1):335–346

  11. Kohavi R (1995) A study of cross-validation and bootstrap for accuracy estimation and model selection. In: Proceedings of the 14th international joint conference on artificial intelligence (IJCAI), Morgan Kaufmann, Montreal, Canada, August 20–25, pp 1137–1143

  12. Chapelle O, Vapnik VN, Bousquet O, Mukherjee S (2002) Choosing multiple parameters for support vector machines. Mach Learn 46(1):131–159

  13. Vapnik VN, Chapelle O (2000) Bounds on error expectation for support vector machines. Neural Comput 12(9):2013–2036

  14. Chung KM, Kao WC, Sun CL, Wang LL, Lin CJ (2003) Radius margin bounds for support vector machines with the RBF kernel. Neural Comput 15(11):2643–2681

  15. Xu ZB, Dai MW, Meng DY (2009) Fast and efficient strategies for model selection of Gaussian support vector machine. IEEE Trans Syst Man Cybern Part B Cybern 39(5):1292–1307

  16. Chang MW, Lin CJ (2005) Leave-one-out bounds for support vector regression model selection. Neural Comput 17(5):1188–1222

  17. Cawley GC, Talbot NLC (2007) Preventing over-fitting in model selection via Bayesian regularisation of the hyper-parameters. J Mach Learn Res 8:841–861

  18. Li S, Tan M (2010) Tuning SVM parameters by using a hybrid CLPSO-BFGS algorithm. Neurocomputing 73:2089–2096

  19. Avci E (2009) Selecting of the optimal feature subset and kernel parameters in digital modulation classification by using hybrid genetic algorithm-support vector machines: HGASVM. Expert Syst Appl 36:1391–1402

  20. Mao WT, Tian M, Yan GR (2012) Research of load identification based on multiple-input multiple-output SVM model selection. Proc Inst Mech Eng Part C J Mech Eng Sci 226(5):1395–1409

  21. Evgeniou T, Pontil M (2004) Regularized multi-task learning. In: Proceedings of tenth ACM SIGKDD international conference on knowledge discovery and data mining (KDD), Seattle, USA, pp 109–117

  22. Mao WT, Yan GR, Dong LL, Hu DK (2011) Model selection for least squares support vector regressions based on small-world strategy. Expert Syst Appl 38(4):3227–3237

  23. Tikhonov AN (1998) Nonlinear ill-posed problems. Chapman & Hall, London

  24. Allen DM (1974) The relationship between variable selection and prediction. Technometrics 16:125–127

  25. Huang VL, Suganthan PN, Liang JJ (2006) Comprehensive learning particle swarm optimizer for solving multiobjective optimization problems. Int J Intell Syst 21:209–226

Acknowledgments

We thank Suganthan [25] for providing the implementation of MOCLPSO. This work was supported by the National Natural Science Foundation of China (No. U1204609) and the Foundation and Advanced Technology Research Program of Henan Province, China (No. 122300410111).

Author information

Corresponding author

Correspondence to Wentao Mao.

Cite this article

Mao, W., Mu, X., Zheng, Y. et al. Leave-one-out cross-validation-based model selection for multi-input multi-output support vector machine. Neural Comput & Applic 24, 441–451 (2014). https://doi.org/10.1007/s00521-012-1234-5
