Neural Processing Letters, Volume 26, Issue 1, pp 41–56

New Least Squares Support Vector Machines Based on Matrix Patterns

Abstract

Support vector machine (SVM), an effective method for classification problems, seeks the optimal hyperplane that maximizes the margin between two classes; this hyperplane is obtained by solving a constrained optimization criterion using quadratic programming (QP), which incurs a high computational cost. Least squares support vector machine (LS-SVM), a variant of SVM, avoids this shortcoming by obtaining an analytical solution directly from a set of linear equations instead of a QP. Both SVM and LS-SVM operate on patterns represented as vectors, i.e., before applying SVM or LS-SVM, any non-vector pattern such as an image must first be vectorized by some technique such as concatenation. However, some implicit structural or local contextual information may be lost in this transformation. Moreover, since the dimension d of the weight vector in SVM or LS-SVM with a linear kernel equals the dimension d1 × d2 of the original input pattern, the higher the dimension of a vector pattern, the more space is needed to store it. In this paper, inspired by methods of feature extraction that work directly on matrix patterns and by the advantages of LS-SVM, we propose a new classifier design method based on matrix patterns, called MatLSSVM. The new method not only operates directly on original matrix patterns but also reduces the memory needed for the weight vector from d1 × d2 to d1 + d2. Like LS-SVM, however, MatLSSVM suffers from unclassifiable regions when extended to multi-class problems. Therefore, following the fuzzy version of LS-SVM, a corresponding fuzzy version of MatLSSVM (MatFLSSVM) is further proposed to remove unclassifiable regions effectively in multi-class problems. Experimental results on several benchmark datasets show that the proposed methods are competitive in classification performance with LS-SVM, fuzzy LS-SVM (FLS-SVM), and the more recent MatPCA and MatFLDA. More importantly, the idea used here may provide a novel way of constructing learning models.
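
To make the contrast concrete, here is a minimal sketch (illustrative only, not the paper's implementation; the function names `lssvm_train`, `lssvm_predict`, and `matlssvm_decision` and the regularization parameter `gamma` are hypothetical) of how a linear-kernel LS-SVM is trained by solving a single linear system rather than a QP, and of the bilinear decision form that lets a MatLSSVM-style classifier store only d1 + d2 weights for a d1 × d2 matrix pattern:

```python
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Train a linear-kernel LS-SVM by solving the KKT linear system
        [ 0   y^T             ] [b]     = [0]
        [ y   Omega + I/gamma ] [alpha]   [1]
    with Omega_kl = y_k * y_l * <x_k, x_l>, instead of running a QP."""
    n = X.shape[0]
    Omega = (y[:, None] * y[None, :]) * (X @ X.T)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)   # analytical solution: one linear solve
    return sol[1:], sol[0]          # (alpha, b)

def lssvm_predict(X_train, y_train, alpha, b, X_new):
    # Decision function: f(x) = sum_i alpha_i * y_i * <x_i, x> + b
    return np.sign(X_new @ X_train.T @ (alpha * y_train) + b)

def matlssvm_decision(A_pattern, u, v, b):
    """MatLSSVM-style bilinear decision value on a d1 x d2 matrix
    pattern: f(A) = u^T A v + b. Only d1 + d2 weights are stored,
    versus d1 * d2 for the vectorized pattern (the alternating
    training of u and v derived in the paper is omitted here)."""
    return float(u @ A_pattern @ v + b)
```

For example, a 32 × 32 image pattern requires a 1024-dimensional weight vector under vectorized LS-SVM with a linear kernel, but only 32 + 32 = 64 weights under the bilinear form above.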

Keywords

Support vector machine (SVM) · Least squares support vector machine (LS-SVM) · Fuzzy least squares support vector machine (FLS-SVM) · Vector pattern · Matrix pattern · Pattern recognition

References

  1. Vapnik V (1995) The nature of statistical learning theory. Springer-Verlag, New York
  2. Vapnik V (1998) Statistical learning theory. John Wiley, New York
  3. Vapnik V (1998) The support vector method of function estimation. In: Suykens JAK, Vandewalle J (eds) Nonlinear modeling: advanced black-box techniques. Kluwer Academic Publishers, Boston, pp 55–85
  4. Suykens JAK, Vandewalle J (1999) Least squares support vector machine classifiers. Neural Process Lett 9:293–300
  5. Suykens JAK, Van Gestel T, De Brabanter J, De Moor B, Vandewalle J (2002) Least squares support vector machines. World Scientific, Singapore (ISBN 981-238-151-1)
  6. Beymer D, Poggio T (1996) Image representations for visual learning. Science 272:1905–1909
  7. Chen SC, Zhu YL, Zhang DQ, Yang JY (2005) Feature extraction approaches based on matrix pattern: MatPCA and MatFLDA. Pattern Recognit Lett 26:1157–1167
  8. Wang H, Ahuja N (2005) Rank-R approximation of tensors: using image-as-matrix representation. In: IEEE Conference on Computer Vision and Pattern Recognition (CVPR'05)
  9. Suykens JAK, De Brabanter J, Lukas L, Vandewalle J (2002) Weighted least squares support vector machines: robustness and sparse approximation. Neurocomputing 48:85–105
  10. Yang J, Zhang D, Frangi AF, Yang J-y (2004) Two-dimensional PCA: a new approach to appearance-based face representation and recognition. IEEE Trans Pattern Anal Mach Intell 26(1):131–137
  11. Ye J, Janardan R, Li Q (2004) Two-dimensional linear discriminant analysis. Adv Neural Inf Process Syst 17:1569–1576
  12. Li M, Yuan B (2005) 2D-LDA: a statistical linear discriminant analysis for image matrix. Pattern Recognit Lett 26:527–532
  13. Tsujinishi D, Abe S (2003) Fuzzy least squares support vector machines for multiclass problems. Neural Netw 16(5–6):785–792
  14. Kreßel UH-G (1999) Pairwise classification and support vector machines. In: Schölkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods: support vector learning. MIT Press, Cambridge, MA, pp 255–268
  15. Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press, Cambridge
  16. Graham A (1981) Kronecker products and matrix calculus: with applications. Halsted Press, John Wiley and Sons, New York

Copyright information

© Springer Science+Business Media, LLC 2007

Authors and Affiliations

  1. Department of Computer Science & Engineering, Nanjing University of Aeronautics & Astronautics, Nanjing, China