Abstract
In this paper, we propose an implicit Lagrangian twin support vector machine (TWSVM) classifier obtained by formulating a pair of unconstrained minimization problems (UMPs) in the dual variables, whose solutions are found using a finite Newton method. The advantage of the generalized Hessian approach is that our modified UMPs reduce to solving just two systems of linear equations, as opposed to the two quadratic programming problems of TWSVM and TBSVM, which leads to an extremely simple and fast algorithm. Unlike the classical TWSVM and least squares TWSVM (LSTWSVM), the proposed algorithm implements the structural risk minimization principle by adding a regularization term to the primal problems, which embodies the essence of statistical learning theory. Computational comparisons of the proposed method against GEPSVM, TWSVM, STWSVM and LSTWSVM have been made on both synthetic and well-known real-world benchmark datasets. Experimental results show that our method yields significantly better generalization performance in terms of both computational time and classification accuracy.
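To make the computational claim concrete, the sketch below illustrates the style of damped generalized-Newton iteration the abstract describes: a piecewise-quadratic unconstrained objective whose generalized Hessian (in the sense of Hiriart-Urruty et al. 1984) turns each Newton step into a single linear system. This is a minimal sketch using the classic finite-Newton objective of Fung and Mangasarian (2003), not the paper's exact pair of UMPs; the function names, the parameter nu, and the stopping rule are illustrative assumptions only.

```python
import numpy as np

def plus(x):
    """Plus function (x)_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def generalized_newton(Q, e, nu=1.0, tol=1e-6, max_iter=50):
    """Generalized Newton iteration for the piecewise-quadratic UMP
        min_u  f(u) = 0.5*||u||^2 + (nu/2)*||(e - Q u)_+||^2.
    Each step solves one linear system built from the generalized
    Hessian; cf. Fung and Mangasarian (2003)."""
    m, n = Q.shape
    u = np.zeros(n)
    for _ in range(max_iter):
        r = e - Q @ u                        # residual (e - Qu)
        grad = u - nu * (Q.T @ plus(r))      # gradient of f at u
        if np.linalg.norm(grad) < tol:
            break                            # (near-)stationary point reached
        d = (r > 0).astype(float)            # step function supplies the
        H = np.eye(n) + nu * Q.T @ (d[:, None] * Q)  # generalized Hessian
        u = u - np.linalg.solve(H, grad)     # one linear system per step
    return u
```

In the twin setting, one such solve would be carried out for each of the two nonparallel hyperplanes, which is why the method amounts to two systems of linear equations rather than two quadratic programs. A damped variant would scale the Newton step by an Armijo line search (Armijo 1966); in practice the undamped iteration typically terminates in a handful of steps.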
References
Armijo L (1966) Minimization of functions having Lipschitz-continuous first partial derivatives. Pac J Math 16:1–3
Balasundaram S, Tanveer M (2012) On proximal bilateral-weighted fuzzy support vector machine classifiers. Int J Adv Intell Paradig 4(3/4):199–210
Balasundaram S, Tanveer M (2013) On Lagrangian twin support vector regression. Neural Comput Appl 22(1):257–267
Balasundaram S, Tanveer M (2013) Smooth Newton method for implicit Lagrangian twin support vector regression. KES J 17(4):267–278
Brown MPS, Grundy WN, Lin D (2000) Knowledge-based analysis of micro-array gene expression data using support vector machine. Proc Natl Acad Sci USA 97(1):262–267
Cortes C, Vapnik VN (1995) Support vector networks. Mach Learn 20:273–297
Demsar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30
Deng NY, Tian YJ, Zhang CH (2013) Support vector machines: optimization based theory, algorithms, and extensions. CRC Press, Boca Raton
Duda RO, Hart PE, Stork DG (2001) Pattern classification, 2nd edn. Wiley, New York
Fung G, Mangasarian OL (2001) Proximal support vector machines. In: Provost F, Srikant R (eds) Proceedings of KDD-2001: knowledge discovery and data mining, August 26–29, 2001, San Francisco, CA. ACM, New York, pp 76–86
Fung G, Mangasarian OL (2003) Finite Newton method for Lagrangian support vector machine classification. Neurocomputing 55(1–2):39–55
Gao S, Ye Q, Ye N (2011) 1-Norm least squares twin support vector machines. Neurocomputing 74:3590–3597
Hiriart-Urruty JB, Strodiot JJ, Nguyen VH (1984) Generalized Hessian matrix and second order optimality conditions for problems with C^{1,1} data. Appl Math Optim 11:43–56
Jayadeva, Khemchandani R, Chandra S (2007) Twin support vector machines for pattern classification. IEEE Trans Pattern Anal Mach Intell 29(5):905–910
Joachims T (1999) Making large-scale support vector machine learning practical. In: Advances in kernel methods: support vector learning. MIT Press, Cambridge
Joachims T (1998) Text categorization with support vector machines: learning with many relevant features. In: Nédellec C, Rouveirol C (eds) Proceedings of the 10th European conference on machine learning (ECML-98), Chemnitz, Germany, pp 137–142
Kumar MA, Gopal M (2008) Application of smoothing technique on twin support vector machines. Pattern Recogn Lett 29:1842–1848
Kumar MA, Gopal M (2009) Least squares twin support vector machines for pattern classification. Expert Syst Appl 36:7535–7543
Lee YJ, Mangasarian OL (2001a) RSVM: reduced support vector machines. In: Proceedings of the first SIAM international conference on data mining, pp 5–7
Lee YJ, Mangasarian OL (2001b) SSVM: a smooth support vector machine for classification. Comput Optim Appl 20(1):5–22
Mangasarian OL (1994) Nonlinear programming. SIAM, Philadelphia
Mangasarian OL (2002) A finite Newton method for classification. Optim Methods Softw 17:913–929
Mangasarian OL, Musicant DR (1999) Successive overrelaxation for support vector machines. IEEE Trans Neural Netw 10:1032–1037
Mangasarian OL, Musicant DR (2001) Lagrangian support vector machines. J Mach Learn Res 1:161–177
Mangasarian OL, Solodov MV (1993) Nonlinear complementarity as unconstrained and constrained minimization. Math Program Ser B 62:277–297
Mangasarian OL, Wild EW (2006) Multisurface proximal support vector classification via generalized eigenvalues. IEEE Trans Pattern Anal Mach Intell 28(1):69–74
Murphy PM, Aha DW (1992) UCI repository of machine learning databases. University of California, Irvine. http://www.ics.uci.edu/~mlearn
Musicant D (1998) NDC: normally distributed clustered datasets. Computer Sciences Department, University of Wisconsin, Madison. www.cs.wisc.edu/dmi/svm/ndc
Nasiri JA, Charkari NM, Jalili S (2014) Least squares twin multi-class classification support vector machine. Pattern Recogn. doi:10.1016/j.patcog.2014.09.020
Osuna E, Freund R, Girosi F (1997) Training support vector machines: an application to face detection. In: Proceedings of computer vision and pattern recognition, pp 130–136
Peng X (2010) TSVR: an efficient twin support vector machine for regression. Neural Netw 23(3):365–372
Peng X, Xu D (2013) Robust minimum class variance twin support vector machine classifier. Neural Comput Appl 22(5):999–1011
Platt J (1999) Fast training of support vector machines using sequential minimal optimization. In: Scholkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods-support vector learning. MIT Press, Cambridge, pp 185–208
Qi Z, Tian Y, Shi Y (2012) Laplacian twin support vector machine for semi-supervised classification. Neural Netw 35:46–53
Ripley BD (2008) Pattern recognition and neural networks. Cambridge University Press, Cambridge
Scholkopf B, Smola A (2002) Learning with kernels. MIT Press, Cambridge
Shao YH, Zhang CH, Wang XB, Deng NY (2011) Improvements on twin support vector machines. IEEE Trans Neural Netw 22(6):962–968
Shao YH, Deng NY, Yang ZM, Chen WJ, Wang Z (2012) Probabilistic outputs for twin support vector machines. Knowl-Based Syst 33:145–151
Shao YH, Chen WJ, Deng NY (2014) Nonparallel hyperplane support vector machine for binary classification problems. Inf Sci 263:22–35
Tanveer M (2014) Application of smoothing techniques for linear programming twin support vector machines. Knowl Inf Syst. doi:10.1007/s10115-014-0786-3
Tanveer M (2015) Robust and sparse linear programming twin support vector machines. Cogn Comput 7(1):137–149
Tian Y, Ping Y (2014) Large-scale linear nonparallel support vector machine solver. Neural Netw 50:166–174
Tsang IW, Kwok JT, Cheung PM (2005) Core vector machines: fast SVM training on very large datasets. J Mach Learn Res 6:363–392
Vapnik VN (2000) The nature of statistical learning theory, 2nd edn. Springer, New York
Xu Y, Wang L (2012) A weighted twin support vector regression. Knowl-Based Syst 33:92–101
Zhong P, Xu Y, Zhao Y (2012) Training twin support vector regression via linear programming. Neural Comput Appl 21(2):399–407
Acknowledgments
The author gratefully acknowledges the valuable comments of the anonymous reviewers and the Editor of the International Journal of Machine Learning and Cybernetics. The author would also like to express his sincere gratitude to Prof. Yuh-Jye Lee, NTUST, Taiwan, and Prof. S. Balasundaram, JNU, New Delhi, India, for their help during the preparation of this manuscript.
Additional information
This work was carried out at the School of Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi, India, and the Department of Computer Science and Engineering, The LNM Institute of Information Technology (LNMIIT), Jaipur, India.
Cite this article
Tanveer, M. Newton method for implicit Lagrangian twin support vector machines. Int. J. Mach. Learn. & Cyber. 6, 1029–1040 (2015). https://doi.org/10.1007/s13042-015-0414-x