
Efficient sparse nonparallel support vector machines for classification

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this paper, we propose a novel nonparallel classifier for binary classification, named the sparse nonparallel support vector machine (SNSVM). Unlike existing nonparallel classifiers such as the twin support vector machines (TWSVMs), SNSVM has several advantages: it constructs two convex quadratic programming problems for both the linear and the nonlinear case, which can be solved efficiently by the successive overrelaxation (SOR) technique; it does not require computing inverse matrices before training; its solutions have sparseness similar to that of standard SVMs; and it degenerates to TWSVM when its parameters are chosen appropriately, so TWSVM is contained as a special case. In this sense, SNSVM is theoretically at least as general as TWSVM. Experimental results on a variety of data sets demonstrate the effectiveness of our method in terms of both sparseness and classification accuracy, further confirming these conclusions.
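
The abstract states that each SNSVM subproblem is a convex quadratic program solved efficiently by the successive overrelaxation (SOR) technique. As a rough illustration of that kind of solver, the sketch below shows projected SOR for a generic box-constrained QP, min 0.5 * alpha^T Q alpha + p^T alpha subject to 0 <= alpha <= C. The function name, the stopping rule, and the particular Q, p, and C used in the demo are illustrative assumptions, not the formulation from the paper.

```python
import numpy as np

def projected_sor(Q, p, C, omega=1.3, tol=1e-6, max_iter=1000):
    """Projected successive overrelaxation (SOR) for the box-constrained QP
        min_alpha  0.5 * alpha^T Q alpha + p^T alpha
        s.t.       0 <= alpha_i <= C  for every component i.
    Q is assumed symmetric positive semidefinite with a strictly positive
    diagonal; omega in (0, 2) is the relaxation factor.
    """
    n = Q.shape[0]
    alpha = np.zeros(n)
    for _ in range(max_iter):
        alpha_prev = alpha.copy()
        for i in range(n):
            # Gauss-Seidel-style sweep: use the freshest alpha values,
            # take an over-relaxed step, then clip back into the box.
            grad_i = Q[i] @ alpha + p[i]
            alpha[i] = min(C, max(0.0, alpha[i] - omega * grad_i / Q[i, i]))
        if np.linalg.norm(alpha - alpha_prev) < tol:
            break
    return alpha

# Tiny synthetic check: a separable 2-variable QP with a known solution.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
p = np.array([-2.0, -6.0])          # unconstrained minimizers are 1.0 and 3.0
print(projected_sor(Q, p, C=2.0))   # approx. [1.0, 2.0]; the second hits the bound
```

In the actual SNSVM subproblems, Q and p would be built from the kernel matrix and the two classes' training data; those details appear in the body of the paper rather than in this abstract.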



Acknowledgments

This work has been partially supported by grants from the National Natural Science Foundation of China (No. 11271361, No. 70921061), the CAS/SAFEA International Partnership Program for Creative Research Teams, the Major International (Regional) Joint Research Project (No. 71110107026), and the President Fund of GUCAS.

Author information

Correspondence to Yingjie Tian.

About this article

Cite this article

Tian, Y., Ju, X. & Qi, Z. Efficient sparse nonparallel support vector machines for classification. Neural Comput & Applic 24, 1089–1099 (2014). https://doi.org/10.1007/s00521-012-1331-5

