L1-norm loss-based projection twin support vector machine for binary classification

Abstract

This paper presents an L1-norm loss-based projection twin support vector machine (L1LPTSVM) for binary classification. In the pair of optimization problems of L1LPTSVM, L1-norm-based losses are adopted for the two classes, which leads to dual problems different from those of the projection twin support vector machine (PTSVM). Compared with PTSVM, L1LPTSVM has two main advantages. First, its dual problems avoid the costly computation of inverse matrices during training, so L1LPTSVM can be solved efficiently by standard SVM-type training algorithms. Second, like the traditional SVM, L1LPTSVM takes a unified form in the linear and nonlinear cases. In addition, a density-dependent quantization scheme for sparse representation is used as a data preprocessing unit attached to L1LPTSVM, which makes L1LPTSVM more suitable for large-scale problems. Extensive experimental results on several artificial and benchmark data sets show the effectiveness of the proposed method.
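The density-dependent quantization idea can be illustrated with a minimal sketch, a simplified stand-in rather than the exact scheme used in the paper: the training set is compressed into a small weighted codebook, where each codeword is the mean of the points falling into a grid cell and its weight is that cell's point count, so denser regions contribute heavier representatives. The helper name `density_quantize` and the grid-based bucketing are illustrative assumptions.

```python
import numpy as np

def density_quantize(X, n_bins=8):
    """Compress X (n_samples, n_features) into a weighted codebook.

    Points are bucketed on a regular grid; each occupied cell is
    replaced by the mean of its points (one codeword) with a weight
    equal to the cell's point count, so dense regions carry more weight.
    (Illustrative sketch only, not the authors' exact scheme.)
    """
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)          # avoid division by zero
    cells = np.floor((X - lo) / span * n_bins).astype(int)
    cells = np.clip(cells, 0, n_bins - 1)           # max point lands in last bin
    keys = np.ravel_multi_index(cells.T, (n_bins,) * X.shape[1])
    codebook, weights = [], []
    for k in np.unique(keys):
        mask = keys == k
        codebook.append(X[mask].mean(axis=0))
        weights.append(mask.sum())
    return np.array(codebook), np.array(weights)

rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 2))                     # large toy data set
Q, w = density_quantize(X, n_bins=8)
# The quantized set is far smaller; the weights preserve the sample count.
print(Q.shape, int(w.sum()))
```

A classifier trained on the weighted codebook `(Q, w)` instead of the full set `X` then scales to large data sets, which is the role the preprocessing unit plays for L1LPTSVM.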



Acknowledgements

The authors are very grateful to the National Natural Science Foundation of China (No. 61379101), the Jiangsu Natural Science Foundation of China (No. BK20151299) and the Jiangsu Provincial Science and Technology Support Project of China (Nos. BY2016065-01, BY2016065-05) for support.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Xiaopeng Hua.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Communicated by A. Di Nola.


About this article

Cite this article

Hua, X., Xu, S., Gao, J. et al. L1-norm loss-based projection twin support vector machine for binary classification. Soft Comput 23, 10649–10659 (2019). https://doi.org/10.1007/s00500-019-04002-6


Keywords

  • Projection twin support vector machine
  • L1-norm loss
  • Inverse matrix
  • Density-dependent quantization scheme
  • Large-scale data sets