Neural Processing Letters, Volume 26, Issue 1, pp 41–56
New Least Squares Support Vector Machines Based on Matrix Patterns
Zhe Wang, Department of Computer Science & Engineering, Nanjing University of Aeronautics & Astronautics
Songcan Chen (corresponding author), Department of Computer Science & Engineering, Nanjing University of Aeronautics & Astronautics
Abstract
Support vector machine (SVM), an effective method for classification problems, finds the optimal hyperplane that maximizes the margin between two classes; this hyperplane is obtained by solving a constrained optimization criterion via quadratic programming (QP), which incurs a high computational cost. Least squares support vector machine (LSSVM), a variant of SVM, avoids this shortcoming by obtaining an analytical solution directly from a set of linear equations instead of a QP. Both SVM and LSSVM operate on patterns represented as vectors, i.e., before applying SVM or LSSVM, any non-vector pattern such as an image must first be vectorized by some technique such as concatenation. However, some implicit structural or local contextual information may be lost in this transformation. Moreover, since the dimension d of the weight vector in SVM or LSSVM with the linear kernel equals the dimension d_1 × d_2 of the original input pattern, the higher the dimension of a vector pattern, the more space is needed to store it. In this paper, inspired by methods of feature extraction that operate directly on matrix patterns and by the advantages of LSSVM, we propose a new classifier design method based on matrix patterns, called MatLSSVM, which can not only operate directly on the original matrix patterns but also reduce the memory needed for the weight vector from d_1 × d_2 to d_1 + d_2. However, like LSSVM, MatLSSVM inherits LSSVM's unclassifiable regions when extended to multi-class problems. Thus, following the fuzzy version of LSSVM, a corresponding fuzzy version of MatLSSVM (MatFLSSVM) is further proposed to remove unclassifiable regions effectively in multi-class problems.
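The analytical LSSVM solution mentioned above can be sketched as follows. This is a minimal illustration of the standard LSSVM dual formulation with a linear kernel (the function names and toy data are ours, not the paper's): training reduces to solving one bordered linear system rather than a QP.

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Fit a linear-kernel LSSVM by solving one linear system.

    X: (n, d) vector patterns; y: (n,) labels in {-1, +1};
    gamma: regularization parameter of the LSSVM objective.
    """
    n = X.shape[0]
    K = X @ X.T                          # linear-kernel Gram matrix
    A = np.empty((n + 1, n + 1))
    A[0, 0] = 0.0
    A[0, 1:] = 1.0                       # equality constraint row
    A[1:, 0] = 1.0                       # bias column
    A[1:, 1:] = K + np.eye(n) / gamma    # regularized kernel block
    rhs = np.concatenate(([0.0], y.astype(float)))
    sol = np.linalg.solve(A, rhs)        # analytical solution, no QP
    return sol[1:], sol[0]               # alpha, b

def lssvm_predict(X_train, alpha, b, X_new):
    """Decision function sign(sum_i alpha_i k(x, x_i) + b)."""
    return np.sign(X_new @ X_train.T @ alpha + b)

# Toy separable data: training reproduces the labels exactly here.
X = np.array([[0., 0.], [0., 1.], [2., 2.], [2., 3.]])
y = np.array([-1, -1, 1, 1])
alpha, b = lssvm_train(X, y)
preds = lssvm_predict(X, alpha, b, X)    # -> [-1, -1, 1, 1]
```

Note how the (n+1)×(n+1) system replaces the QP of the classical SVM; its size grows with the number of training points, not with the pattern dimension.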
Experimental results on benchmark datasets show that the proposed method is competitive in classification performance with LSSVM, fuzzy LSSVM (FLSSVM), and the more recent MatPCA and MatFLDA. More importantly, the idea used here may provide a novel way of constructing learning models.
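The storage saving claimed for MatLSSVM can be illustrated with a small sketch. This is a simplified reading of the abstract, assuming the matrix-pattern classifier scores a d_1 × d_2 pattern A with a bilinear form u^T A v (the exact decision function and training procedure are defined in the paper itself):

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2 = 28, 28
A = rng.random((d1, d2))           # a matrix pattern, e.g. a 28x28 image

# Vector route (SVM/LSSVM with a linear kernel): vectorize A, so the
# weight vector needs d1 * d2 = 784 components.
w = rng.random(d1 * d2)
score_vec = w @ A.reshape(-1)

# Matrix route (MatLSSVM, as described): keep A as a matrix and store two
# weight vectors u (length d1) and v (length d2), scoring with u^T A v,
# so only d1 + d2 = 56 components are kept.
u, v = rng.random(d1), rng.random(d2)
score_mat = u @ A @ v

# The bilinear form is equivalent to using a rank-one weight matrix
# outer(u, v) on the vectorized pattern:
assert np.isclose(score_mat, (np.outer(u, v) * A).sum())
assert u.size + v.size == d1 + d2  # 56 parameters instead of 784
```

The equivalence check also makes the trade-off visible: the memory reduction comes from restricting the weight to a rank-one structure rather than allowing an arbitrary d_1 × d_2 weight matrix.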
Keywords
Support vector machine (SVM) · Least squares support vector machine (LSSVM) · Fuzzy least squares support vector machine (FLSSVM) · Vector pattern · Matrix pattern · Pattern recognition
Journal
Neural Processing Letters, Volume 26, Issue 1, pp 41–56
Cover Date
2007-08
DOI
10.1007/s11063-007-9041-1
Print ISSN
1370-4621
Online ISSN
1573-773X
Publisher
Kluwer Academic Publishers / Plenum Publishers
 Authors

Zhe Wang (1)
Songcan Chen (1)
 Author Affiliations

 1. Department of Computer Science & Engineering, Nanjing University of Aeronautics & Astronautics, 29 Yudao Street, Nanjing, Jiangsu, 210016, China