
A robust algorithm of support vector regression with a trimmed Huber loss function in the primal

Published in Soft Computing (Foundations)

Abstract

Support vector machine for regression (SVR) is an efficient tool for function estimation problems. However, it is sensitive to outliers because its loss function is unbounded. To reduce the effect of outliers, we propose a robust SVR with a trimmed Huber loss function (SVRT). Synthetic and benchmark datasets were employed to assess the performance of SVRT against SVR, least squares SVR (LS-SVR) and a weighted LS-SVR. Numerical tests show that when training samples are subject to normally distributed errors, SVRT is slightly less accurate than SVR and LS-SVR, yet more accurate than the weighted LS-SVR. When training samples are contaminated by outliers, however, SVRT outperforms the other methods. Furthermore, SVRT is faster than the weighted LS-SVR. Experiments on eight benchmark datasets show that SVRT is, on average, more accurate than the other methods when sample points are contaminated by outliers. In conclusion, SVRT can be considered an alternative robust method for modeling contaminated sample points.
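The full text is not reproduced here, but the core idea of a trimmed (truncated) Huber loss can be sketched: the standard Huber loss grows linearly in the tails and is therefore unbounded, while the trimmed version caps the loss at the value it takes for a chosen residual threshold, so every outlier beyond that threshold contributes the same bounded penalty. The following is a minimal sketch of this idea; the parameter names `delta` and `t` and their default values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def huber(r, delta=1.0):
    # Classical Huber loss: quadratic for |r| <= delta, linear beyond.
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * a**2, delta * (a - 0.5 * delta))

def trimmed_huber(r, delta=1.0, t=3.0):
    # Truncate the Huber loss at its value for |r| = t (with t > delta),
    # so residuals larger than t -- typically outliers -- all incur the
    # same constant loss instead of an ever-growing one.
    cap = delta * (t - 0.5 * delta)
    return np.minimum(huber(r, delta), cap)
```

Because the capped loss is constant beyond the threshold, gross outliers exert no pull on the fitted function, which is the source of the robustness reported in the abstract; the price is a non-convex loss, which is why such models are typically trained in the primal (e.g., via a concave-convex procedure).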



Acknowledgments

This work was funded by the National Natural Science Foundation of China (Grant Nos. 41371367, 41101433), the SDUST Research Fund, the Joint Innovative Center for Safe and Effective Mining Technology and Equipment of Coal Resources, Shandong Province, and the Special Project Fund of Taishan Scholars of Shandong Province.

Author information


Corresponding author

Correspondence to Chuanfa Chen.

Ethics declarations

Conflict of interest

All authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Communicated by A. Di Nola.


About this article


Cite this article

Chen, C., Yan, C., Zhao, N. et al. A robust algorithm of support vector regression with a trimmed Huber loss function in the primal. Soft Comput 21, 5235–5243 (2017). https://doi.org/10.1007/s00500-016-2229-4
