
Newton method for implicit Lagrangian twin support vector machines

  • Original Article
  • Published:
International Journal of Machine Learning and Cybernetics

Abstract

In this paper, we propose an implicit Lagrangian twin support vector machine (TWSVM) classifier by formulating a pair of unconstrained minimization problems (UMPs) in dual variables, whose solutions are obtained using a finite Newton method. The advantage of the generalized Hessian approach is that our modified UMPs reduce to solving just two systems of linear equations, as opposed to the two quadratic programming problems required by TWSVM and TBSVM, which leads to an extremely simple and fast algorithm. Unlike the classical TWSVM and least squares TWSVM (LSTWSVM), the structural risk minimization principle is implemented by adding a regularization term to the primal problems of our proposed algorithm, embodying the essence of statistical learning theory. Computational comparisons of our proposed method against GEPSVM, TWSVM, STWSVM and LSTWSVM have been made on both synthetic and well-known real-world benchmark datasets. Experimental results show that our method yields significantly better generalization performance in both computational time and classification accuracy.
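The core idea in the abstract — replacing constrained quadratic programs with unconstrained piecewise-quadratic minimizations solved by a generalized (finite) Newton iteration, where each step is just one linear system — can be illustrated with a minimal sketch. This is not the paper's TWSVM formulation; it applies the same generalized-Hessian Newton idea, with an Armijo line search, to a single standard SVM-style UMP. All function and variable names here are illustrative.

```python
import numpy as np

def svm_objective(w, A, y, C):
    # f(w) = 0.5*||w||^2 + (C/2)*||(e - y.(A w))_+||^2 : convex, piecewise quadratic
    slack = np.maximum(1.0 - y * (A @ w), 0.0)
    return 0.5 * w @ w + 0.5 * C * slack @ slack

def finite_newton_svm(A, y, C=10.0, tol=1e-6, max_iter=50):
    """Generalized (finite) Newton method: each iteration solves ONE linear
    system built from the generalized Hessian, instead of a quadratic program."""
    n = A.shape[1]
    w = np.zeros(n)
    for _ in range(max_iter):
        margin = 1.0 - y * (A @ w)
        slack = np.maximum(margin, 0.0)
        grad = w - C * A.T @ (y * slack)        # gradient of the UMP objective
        if np.linalg.norm(grad) < tol:
            break
        Aact = A[margin > 0.0]                  # rows where the plus function is active
        H = np.eye(n) + C * Aact.T @ Aact       # generalized Hessian (positive definite)
        d = np.linalg.solve(H, grad)            # Newton direction from one linear system
        # Armijo backtracking line search along -d
        t, f0, slope = 1.0, svm_objective(w, A, y, C), grad @ d
        while t > 1e-12 and svm_objective(w - t * d, A, y, C) > f0 - 0.25 * t * slope:
            t *= 0.5
        w = w - t * d
    return w
```

Because the objective is piecewise quadratic and the generalized Hessian is positive definite, the Newton direction is always a descent direction and the iteration terminates in a finite number of steps in exact arithmetic; a bias term can be handled by augmenting the data matrix with a column of ones.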


Figs. 1–3 (figures available in the full article)

References

  1. Armijo L (1966) Minimization of functions having Lipschitz-continuous first partial derivatives. Pac J Math 16:1–3

  2. Balasundaram S, Tanveer M (2012) On proximal bilateral-weighted fuzzy support vector machine classifiers. Int J Adv Intell Paradig 4(3/4):199–210

  3. Balasundaram S, Tanveer M (2013) On Lagrangian twin support vector regression. Neural Comput Appl 22(1):257–267

  4. Balasundaram S, Tanveer M (2013) Smooth Newton method for implicit Lagrangian twin support vector regression. KES J 17(4):267–278

  5. Brown MPS, Grundy WN, Lin D (2000) Knowledge-based analysis of micro-array gene expression data using support vector machine. Proc Natl Acad Sci USA 97(1):262–267

  6. Cortes C, Vapnik VN (1995) Support vector networks. Mach Learn 20:273–297

  7. Demsar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30

  8. Deng NY, Tian YJ, Zhang CH (2013) Support vector machines: optimization based theory, algorithms, and extensions. CRC Press, Boca Raton

  9. Duda RO, Hart PE, Stork DG (2001) Pattern classification, 2nd edn. Wiley, New York

  10. Fung G, Mangasarian OL (2001) Proximal support vector machines. In: Provost F, Srikant R, (eds) Proceedings KDD-2001: knowledge discovery and data mining, August 26–29, 2001, San Francisco, CA, New York, pp 76–86

  11. Fung G, Mangasarian OL (2003) Finite Newton method for Lagrangian support vector machine classification. Neurocomputing 55(1–2):39–55

  12. Gao S, Ye Q, Ye N (2011) 1-Norm least squares twin support vector machines. Neurocomputing 74:3590–3597

  13. Hiriart-Urruty JB, Strodiot JJ, Nguyen VH (1984) Generalized Hessian matrix and second order optimality conditions for problems with CL1 data. Appl Math Optim 11:43–56

  14. Jayadeva, Khemchandani R, Chandra S (2007) Twin support vector machines for pattern classification. IEEE Trans Pattern Anal Mach Intell 29(5):905–910

  15. Joachims T (1999) Making large-scale support vector machine learning practical. In: Scholkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods-support vector learning. MIT Press, Cambridge

  16. Joachims T, Ndellec C, Rouveriol C (1998) Text categorization with support vector machines: learning with many relevant features. Eur Conf Mach Learn Chemnitz Ger 10:137–142

  17. Kumar MA, Gopal M (2008) Application of smoothing technique on twin support vector machines. Pattern Recogn Lett 29:1842–1848

  18. Kumar MA, Gopal M (2009) Least squares twin support vector machines for pattern classification. Expert Syst Appl 36:7535–7543

  19. Lee YJ, Mangasarian OL (2001a) RSVM: reduced support vector machines. In: Proceedings of the first SIAM international conference on data mining, pp 5–7

  20. Lee YJ, Mangasarian OL (2001b) SSVM: a Smooth support vector machine for classification. Comput Optim Appl 20(1):5–22

  21. Mangasarian OL (1994) Nonlinear programming. SIAM, Philadelphia

  22. Mangasarian OL (2002) A finite Newton method for classification. Optim Methods Softw 17:913–929

  23. Mangasarian OL, Musicant DR (1999) Successive overrelaxation for support vector machines. IEEE Trans Neural Netw 10:1032–1037

  24. Mangasarian OL, Musicant DR (2001) Lagrangian support vector machines. J Mach Learn Res 1:161–177

  25. Mangasarian OL, Solodov MV (1993) Nonlinear complementarity as unconstrained and constrained minimization. Math Program Ser B 62:277–297

  26. Mangasarian OL, Wild EW (2006) Multisurface proximal support vector classification via generalized eigenvalues. IEEE Trans Pattern Anal Mach Intell 28(1):69–74

  27. Murphy PM, Aha DW (1992) UCI repository of machine learning databases. University of California, Irvine. http://www.ics.uci.edu/~mlearn

  28. Musicant D (1998) NDC: normally distributed clustered datasets. Computer Sciences Department, University of Wisconsin, Madison. www.cs.wisc.edu/dmi/svm/ndc

  29. Nasiri JA, Charkari NM, Jalili S (2014) Least squares twin multi-class classification support vector machine. Pattern Recogn. doi:10.1016/j.patcog.2014.09.020

  30. Osuna E, Freund R, Girosi F (1997) Training support vector machines: an application to face detection. In: Proceedings of computer vision and pattern recognition, pp 130–136

  31. Peng X (2010) TSVR: an efficient twin support vector machine for regression. Neural Netw 23(3):365–372

  32. Peng X, Xu D (2013) Robust minimum class variance twin support vector machine classifier. Neural Comput Appl 22(5):999–1011

  33. Platt J (1999) Fast training of support vector machines using sequential minimal optimization. In: Scholkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods-support vector learning. MIT Press, Cambridge, pp 185–208

  34. Qi Z, Tian Y, Shi Y (2012) Laplacian twin support vector machine for semi-supervised classification. Neural Netw 35:46–53

  35. Ripley BD (2008) Pattern recognition and neural networks. Cambridge University Press, Cambridge

  36. Scholkopf B, Smola A (2002) Learning with kernels. MIT Press, Cambridge

  37. Shao Y-H, Zhang CH, Wang XB, Deng NY (2011) Improvements on twin support vector machines. IEEE Trans Neural Netw 22(6):962–968

  38. Shao YH, Deng NY, Yang ZM, Chen WJ, Wang Z (2012) Probabilistic outputs for twin support vector machines. Knowl-Based Syst 33:145–151

  39. Shao YH, Chen WJ, Deng NY (2014) Nonparallel hyperplane support vector machine for binary classification problems. Inf Sci 263:22–35

  40. Tanveer M (2014) Application of smoothing techniques for linear programming twin support vector machines. Knowl Inf Syst. doi:10.1007/s10115-014-0786-3

  41. Tanveer M (2015) Robust and sparse linear programming twin support vector machines. Cogn Comput 7(1):137–149

  42. Tian Y, Ping Y (2014) Large-scale linear nonparallel support vector machine solver. Neural Netw 50:166–174

  43. Tsang IW, Kwok JT, Cheung PM (2005) Core vector machines: fast SVM training on very large datasets. J Mach Learn Res 6:363–392

  44. Vapnik VN (2000) The nature of statistical learning theory, 2nd edn. Springer, New York

  45. Xu Y, Wang L (2012) A weighted twin support vector regression. Knowl-Based Syst 33:92–101

  46. Zhong P, Xu Y, Zhao Y (2012) Training twin support vector regression via linear programming. Neural Comput Appl 21(2):399–407

Acknowledgments

The author acknowledges the valuable comments of the anonymous reviewers and the Editor of the International Journal of Machine Learning and Cybernetics, whose engagement is gratefully appreciated. The author would also like to express his sincere gratitude to Prof. Yuh-Jye Lee, NTUST, Taiwan, and Prof. S. Balasundaram, JNU, New Delhi, India, for their help during the preparation of this manuscript.

Author information

Corresponding author

Correspondence to M. Tanveer.

Additional information

This work was carried out at the School of Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi, India, and the Department of Computer Science and Engineering, The LNM Institute of Information Technology (LNMIIT), Jaipur, India.

About this article

Cite this article

Tanveer, M. Newton method for implicit Lagrangian twin support vector machines. Int. J. Mach. Learn. & Cyber. 6, 1029–1040 (2015). https://doi.org/10.1007/s13042-015-0414-x
