
Incremental learning for Lagrangian ε-twin support vector regression

Soft Computing (Data Analytics and Machine Learning)

Abstract

This paper investigates the online learning problem of Lagrangian \(\varepsilon\)-twin support vector regression (L-\(\varepsilon\)-TSVR), with the goal of presenting incremental implementations. First, to address the inability of the existing L-\(\varepsilon\)-TSVR to efficiently update the model in incremental scenarios, an incremental Lagrangian \(\varepsilon\)-twin support vector regression (IL-\(\varepsilon\)-TSVR) based on the semi-smooth Newton method is proposed. By using matrix inversion theorems to update the Hessian matrices incrementally, IL-\(\varepsilon\)-TSVR lowers the time complexity and speeds up training. However, in the nonlinear case, the training speed of IL-\(\varepsilon\)-TSVR decreases rapidly as the kernel matrix grows. Therefore, an incremental reduced Lagrangian \(\varepsilon\)-twin support vector regression (IRL-\(\varepsilon\)-TSVR) is presented. IRL-\(\varepsilon\)-TSVR employs the reduced technique to restrict the size of the inverse matrix, at the cost of slightly lower prediction accuracy. Next, to lessen the accuracy loss caused by parameter reduction, a novel regularization term is introduced to replace the original one, and an improved incremental reduced Lagrangian \(\varepsilon\)-twin support vector regression (IIRL-\(\varepsilon\)-TSVR) is designed. Results on UCI benchmark datasets show that IL-\(\varepsilon\)-TSVR can effectively address linear regression problems in incremental scenarios and achieves almost the same generalization capability as offline learning. Moreover, IRL-\(\varepsilon\)-TSVR and IIRL-\(\varepsilon\)-TSVR reduce the training time of nonlinear regression models and yield sparse solutions, with generalization capabilities close to those of their offline counterparts. In particular, the proposed algorithms enable fast incremental learning on large-scale data.
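The abstract's key mechanism is updating the inverse of a Hessian matrix incrementally via matrix inversion theorems instead of re-inverting from scratch. The following is a minimal sketch of that idea for the simplest case, a rank-one (Sherman-Morrison) update when a single new sample arrives; the paper's actual IL-\(\varepsilon\)-TSVR updates are not shown here, and the identity-matrix regularizer and function names below are illustrative assumptions.

```python
import numpy as np

def sherman_morrison_update(H_inv, x):
    """Return (H + x x^T)^{-1} given H^{-1}, in O(d^2) instead of O(d^3).

    Uses the Sherman-Morrison identity:
      (H + x x^T)^{-1} = H^{-1} - (H^{-1} x)(H^{-1} x)^T / (1 + x^T H^{-1} x)
    (H is assumed symmetric positive definite, so H^{-1} is symmetric.)
    """
    Hx = H_inv @ x
    return H_inv - np.outer(Hx, Hx) / (1.0 + x @ Hx)

rng = np.random.default_rng(0)
d = 5
H = np.eye(d)                 # hypothetical initial Hessian: a regularizer C*I
H_inv = np.linalg.inv(H)

# stream in samples one at a time, maintaining the inverse incrementally
X = rng.standard_normal((20, d))
for x in X:
    H += np.outer(x, x)       # Hessian grows by a rank-one term per sample
    H_inv = sherman_morrison_update(H_inv, x)

# the incrementally maintained inverse matches a direct (batch) inversion
assert np.allclose(H_inv, np.linalg.inv(H))
```

This kind of update is why the incremental variant avoids the cubic cost of refactoring the Hessian at every step; the reduced variants described in the abstract additionally cap the dimension `d` of the matrix being inverted.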




Availability of data and material

All benchmark datasets used in this paper can be accessed from the UCI machine learning repository.



Funding

This research was funded by the National Natural Science Foundation of China under Grant No. 61773182.

Author information


Contributions

BG performed conceptualization and methodology. JC performed original draft preparation, investigation, and software. FP performed validation and supervision. WX performed funding acquisition. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Binjie Gu.

Ethics declarations

Conflict of interest

All authors declare that there is no conflict of interest.

Ethical approval

This paper has not been previously published elsewhere, and it is not currently being considered for publication elsewhere.

Informed consent

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Gu, B., Cao, J., Pan, F. et al. Incremental learning for Lagrangian ε-twin support vector regression. Soft Comput 27, 5357–5375 (2023). https://doi.org/10.1007/s00500-022-07755-9

