New Least Squares Support Vector Machines Based on Matrix Patterns
Support vector machine (SVM), an effective method for classification problems, finds the optimal hyperplane that maximizes the margin between two classes by solving a constrained optimization problem with quadratic programming (QP), which incurs high computational cost. Least squares support vector machine (LS-SVM), a variant of SVM, avoids this shortcoming by obtaining an analytical solution from a set of linear equations instead of QP. Both SVM and LS-SVM operate directly on patterns represented as vectors, i.e., before applying SVM or LS-SVM, any non-vector pattern such as an image must first be converted into a vector pattern by a technique like concatenation. However, some implicit structural or local contextual information may be lost in this transformation. Moreover, since the dimension d of the weight vector in SVM or LS-SVM with the linear kernel equals the dimension d1 × d2 of the original input pattern, the higher the dimension of a vector pattern, the more space is needed to store it. In this paper, inspired by methods for feature extraction directly based on matrix patterns and by the advantages of LS-SVM, we propose a new classifier design method based on matrix patterns, called MatLSSVM, which not only operates directly on original matrix patterns but also efficiently reduces the memory for the weight vector from d1 × d2 to d1 + d2. Like LS-SVM, however, MatLSSVM inherits the existence of unclassifiable regions when extended to multi-class problems. Thus, following the fuzzy version of LS-SVM, a corresponding fuzzy version of MatLSSVM (MatFLSSVM) is further proposed to remove unclassifiable regions effectively in multi-class problems.
Experimental results on several benchmark datasets show that the proposed method is competitive in classification performance with LS-SVM, fuzzy LS-SVM (FLS-SVM), and the more recent MatPCA and MatFLDA. More importantly, the idea used here may provide a novel way of constructing learning models.
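The memory reduction described above can be sketched concretely. In a matrix-pattern classifier of this kind, the single weight vector of length d1 × d2 acting on a vectorized pattern is replaced by two weight vectors u (length d1) and v (length d2) acting on the matrix pattern A directly, so the decision value takes the form u^T A v + b. The sketch below is illustrative only: the names u, v, b and the random values are assumptions for demonstration, not the trained quantities from the paper.

```python
import numpy as np

d1, d2 = 28, 28                      # e.g. a 28x28 image pattern
rng = np.random.default_rng(0)

A = rng.standard_normal((d1, d2))    # a matrix pattern (not vectorized)
u = rng.standard_normal(d1)          # left weight vector, length d1
v = rng.standard_normal(d2)          # right weight vector, length d2
b = 0.1                              # bias term

# Matrix-side decision value: f(A) = u^T A v + b
f = u @ A @ v + b
label = 1 if f >= 0 else -1

# The vectorized route needs a weight vector of length d1*d2; for the
# rank-one weights kron(u, v) it gives the identical decision value.
w = np.kron(u, v)                    # length d1*d2 = 784
f_vec = w @ A.reshape(-1) + b

print(np.isclose(f, f_vec))          # True: same decision value
print(d1 * d2, "vs", d1 + d2)        # 784 vs 56 stored weights
```

This also makes the trade-off visible: the matrix side stores only d1 + d2 weights, at the cost of restricting the implicit vectorized weight to a rank-one (Kronecker) structure.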
Keywords: Support vector machine (SVM) · Least squares support vector machine (LS-SVM) · Fuzzy least squares support vector machine (FLS-SVM) · Vector pattern · Matrix pattern · Pattern recognition