A new approach for training Lagrangian twin support vector machine via unconstrained convex minimization
In this paper, a novel unconstrained convex minimization formulation of the Lagrangian dual of the recently introduced twin support vector machine (TWSVM) is proposed, in a simpler form, for constructing binary classifiers. Since the objective functions of the modified minimization problems contain the non-smooth 'plus' function, we solve them by a Newton iterative method, either by considering their generalized Hessian matrices or by replacing the 'plus' function with a smooth approximation. Numerical experiments were performed on a number of real-world benchmark data sets. The computational results clearly illustrate the effectiveness and applicability of the proposed approach: comparable or better generalization performance with faster learning speed is obtained in comparison with SVM, least squares TWSVM (LS-TWSVM) and TWSVM.
Keywords: Generalized Hessian approach · Smooth approximation formulation · Twin support vector machine
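The two solution strategies named in the abstract can be sketched concretely. A minimal illustration, under assumptions not taken from this paper: `smooth_plus` uses the standard smoothing function p(x, α) = x + (1/α)·log(1 + exp(−αx)), which approaches (x)₊ as α → ∞, and `newton_gen_hessian` applies a semismooth Newton step with a generalized Hessian to a hypothetical unconstrained objective min_u ½‖u‖² + (C/2)‖(Au − e)₊‖², chosen only as a representative SVM-type piecewise-quadratic problem, not as the paper's exact formulation.

```python
import numpy as np

def plus(x):
    """Non-smooth 'plus' function (x)_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha=5.0):
    """Smooth approximation p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha*x)).
    Converges to (x)_+ as alpha -> infinity; logaddexp avoids overflow."""
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def newton_gen_hessian(A, e, C=10.0, tol=1e-8, max_iter=50):
    """Illustrative Newton iteration with a generalized Hessian for the
    hypothetical objective  min_u 0.5*||u||^2 + (C/2)*||(A u - e)_+||^2.
    The gradient contains (A u - e)_+, which is not differentiable, so a
    generalized Hessian is formed from a 0/1 diagonal 'active set' matrix."""
    m, n = A.shape
    u = np.zeros(n)
    for _ in range(max_iter):
        r = A @ u - e
        grad = u + C * A.T @ plus(r)              # subgradient of the objective
        if np.linalg.norm(grad) < tol:
            break
        D = (r > 0).astype(float)                 # generalized derivative of (.)_+
        H = np.eye(n) + C * A.T @ (D[:, None] * A)  # generalized Hessian
        u = u - np.linalg.solve(H, grad)          # Newton step
    return u
```

The smooth-approximation route would instead replace `plus(r)` with `smooth_plus(r, alpha)` and use the ordinary Hessian of the resulting smooth objective; both routes lead to a finite-dimensional Newton iteration, which is the source of the fast learning speed the abstract reports.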
The authors are thankful to the anonymous reviewers for their comments.