Lagrangian twin parametric insensitive support vector regression (LTPISVR)

  • Original Article
  • Neural Computing and Applications

Abstract

In this paper, motivated by work on twin parametric insensitive support vector regression (TPISVR) (Peng in Neurocomputing 79(1):26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer in Neural Comput Appl 22(1):257–267, 2013), a new efficient approach, Lagrangian twin parametric insensitive support vector regression (LTPISVR), is proposed. To make the objective function strongly convex, we include the square of the 2-norm of the slack variables in the optimization problem. To reduce the computational cost, the solution of the proposed LTPISVR is obtained by simple linearly convergent iterative schemes instead of the quadratic programming problems required by TPISVR, so no optimization toolbox is needed. To demonstrate the effectiveness of the proposed method, we present numerical results on well-known synthetic and real-world datasets. The results clearly show that the proposed method achieves similar or better generalization performance with less training time than support vector regression, twin support vector regression and twin parametric insensitive support vector regression.
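
A rough sketch of the kind of iterative scheme involved is given below. This is only an illustration in the spirit of the Lagrangian SVM update of Mangasarian and Musicant [20] (also used in Lagrangian twin SVR [1]), not the exact LTPISVR iteration derived in the paper; the names H, r, C and alpha are placeholders rather than symbols from the paper. For a strongly convex dual of the form min_{u >= 0} 0.5 u'Qu - r'u with Q = I/C + HH', the fixed-point update u <- Q^{-1}(r + ((Qu - r) - alpha*u)_+) converges linearly whenever 0 < alpha < 2/C, and it requires only matrix–vector operations rather than a quadratic programming solver.

    import numpy as np

    def lagrangian_dual_iteration(H, r, C, alpha=None, tol=1e-6, max_iter=1000):
        """Illustrative LSVM-style fixed-point scheme (not the paper's exact update).

        Solves  min_{u >= 0} 0.5 * u^T Q u - r^T u  with  Q = I/C + H H^T  via
            u <- Q^{-1} ( r + ((Q u - r) - alpha * u)_+ ),   0 < alpha < 2/C.
        """
        m = H.shape[0]
        Q = np.eye(m) / C + H @ H.T        # the I/C term keeps Q positive definite
        Q_inv = np.linalg.inv(Q)           # fine for moderate m; use a linear solver for large m
        if alpha is None:
            alpha = 1.9 / C                # inside the linear-convergence range
        u = np.zeros(m)
        for _ in range(max_iter):
            grad = Q @ u - r
            u_next = Q_inv @ (r + np.maximum(grad - alpha * u, 0.0))
            if np.linalg.norm(u_next - u) < tol:
                return u_next
            u = u_next
        return u

    # Toy usage with random placeholder data (not from the paper's experiments)
    rng = np.random.default_rng(0)
    H = rng.standard_normal((50, 5))
    u = lagrangian_dual_iteration(H, r=np.ones(50), C=10.0)

In methods of this type, squaring the 2-norm of the slack variables is what contributes the I/C-type term that keeps the dual Hessian nonsingular, which is why such a scheme can replace the QP solvers required by TPISVR.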

References

  1. Balasundaram S, Tanveer M (2013) On Lagrangian twin support vector regression. Neural Comput Appl 22(1):257–267

  2. Bao Y-K, Liu Z-T, Guo L, Wang W (2005) Forecasting stock composite index by fuzzy support vector machines regression. In: Proceedings of the international conference on machine learning and cybernetics, vol 6, pp 3535–3540

  3. Box GEP, Jenkins GM (1976) Time series analysis: forecasting and control. Holden-Day, San Francisco

  4. Brockwell PJ, Davis RA (2002) Introduction to time series and forecasting, 2nd edn. Springer, Berlin

  5. Casdagli M (1989) Nonlinear prediction of chaotic time series. Physica D 35:335–356

  6. Cortes C, Vapnik V (1995) Support-vector networks. Mach Learn 20(3):273–297

  7. Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines and other kernel based learning methods. Cambridge University Press, Cambridge

  8. DELVE (2005) Data for evaluating learning in valid experiments. http://www.cs.toronto.edu/~delve/data. Accessed 1 Feb 2017

  9. Demsar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30

  10. Garcia S, Herrera F (2008) An extension on statistical comparisons of classifiers over multiple data sets for all pairwise comparisons. J Mach Learn Res 9:2677–2694

  11. Gretton A, Doucet A, Herbrich R, Rayner PJW, Scholkopf B (2001) Support vector regression for black-box system identification. In: Proceedings of the 11th IEEE workshop on statistical signal processing

  12. Gupta D (2017) Training primal K-nearest neighbor based weighted twin support vector regression via unconstrained convex minimization. Appl Intell. https://doi.org/10.1007/s10489-017-0913-4

  13. Gupta D, Richhariya B, Borah P (2018) A fuzzy twin support vector machine based on information entropy for class imbalance learning. Neural Comput Appl. https://doi.org/10.1007/s00521-018-3551-9

  14. Hao PY, Tsai LB, Lin MS (2008) A new support vector classification algorithm with parametric-margin model. In: IEEE international joint conference on neural networks (IJCNN), pp 420–425

  15. Jayadeva, Khemchandani R, Chandra S (2007) Twin support vector machines for pattern classification. IEEE Trans Pattern Anal Mach Intell 29(5):905–910

  16. Khemchandani R, Sharma S (2016) Robust least square twin support vector machine for human activity recognition. Appl Soft Comput 47:33–46

  17. Kim SK, Park YJ, Toh KA, Lee S (2010) SVM-based feature extraction for face recognition. Pattern Recognit 43(8):2871–2881

  18. Malhotra R, Malhotra DK (2003) Evaluating consumer loans using neural networks. Omega 31:83–96

  19. Mangasarian OL (1994) Nonlinear programming. SIAM, Philadelphia

  20. Mangasarian OL, Musicant DR (2001) Lagrangian support vector machines. J Mach Learn Res 1:161–177

  21. Mukherjee S, Osuna E, Girosi F (1997) Nonlinear prediction of chaotic time series using support vector machines. In: NNSP’97: neural networks for signal processing VII: proceedings of IEEE signal processing society workshop, Amelia Island, FL, USA, pp 511–520

  22. Murphy PM, Aha DW (1992) UCI repository of machine learning databases. University of California, Irvine. http://www.ics.uci.edu/~mlearn. Accessed 1 Feb 2017

  23. Peng X (2010) TSVR: an efficient twin support vector machine for regression. Neural Netw 23(3):365–372

  24. Peng X (2012) Efficient twin parametric insensitive support vector regression model. Neurocomputing 79(1):26–38

  25. Richhariya B, Tanveer M (2018) EEG signal classification using universum support vector machine. Expert Syst Appl 106:169–182

  26. Rios EG, Hernandez EE, Miyatake MN, Meana HP (2017) Face recognition with occlusion using a wireframe model and support vector machine. IEEE Latin Am Trans 15(10):1960–1966

  27. Scholkopf B, Smola AJ, Williamson RC, Bartlett PL (2000) New support vector algorithms. Neural Comput 12(5):1207–1245

  28. Shao Y, Chen W, Zhang J, Wang Z, Deng N (2014) An efficient weighted Lagrangian twin support vector machine for imbalanced data classification. Pattern Recognit 47(6):3158–3167

  29. Sjoberg J, Zhang Q, Ljung L, Benveniste A, Delyon B, Glorennec P, Hjalmarsson H, Juditsky A (1995) Nonlinear black-box modeling in system identification: a unified overview. Automatica 31:1691–1724

  30. Vapnik VN (2000) The nature of statistical learning theory, 2nd edn. Springer, New York

  31. Xu Y, Wang L (2014) K-nearest neighbor-based weighted twin support vector regression. Appl Intell 41(1):92–101

  32. Xu Y, Yu J, Zhang Y (2014) KNN-based weighted rough ν-twin support vector machine. Knowl Based Syst 71:303–313

  33. Ye Q, Zhao C, Gao S, Zhang H (2012) Weighted twin support vector machines with local information and its application. Neural Netw 35:31–39

Author information

Corresponding author

Correspondence to Deepak Gupta.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Gupta, D., Acharjee, K. & Richhariya, B. Lagrangian twin parametric insensitive support vector regression (LTPISVR). Neural Comput & Applic 32, 5989–6007 (2020). https://doi.org/10.1007/s00521-019-04084-1
