Discriminant Subspace Learning Based on Support Vector Machines
A new method for dimensionality reduction and feature extraction based on Support Vector Machines and minimization of the within-class data dispersion is proposed. An iterative procedure successively applies Support Vector Machines on mutually perpendicular subspaces via a deflation transformation, so that the within-class variance is minimized at each step. The proposed approach is proven to be equivalent to a succession of SVMs trained with deflation kernels. The normal vectors of the successive separating hyperplanes carry discriminant information and can be used as projection vectors for feature extraction and dimensionality reduction of the data. Experiments on various datasets are conducted to highlight the superior performance of the proposed algorithm.
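To make the deflation idea concrete, the following is a minimal Python sketch of the linear, two-class case: a linear SVM is trained, its unit normal vector is stored as a projection direction, the component along that direction is removed from every sample, and the procedure repeats on the deflated data. The function name, the use of scikit-learn's LinearSVC, and all parameters are assumptions for illustration; the within-class dispersion term of the proposed method and its kernel formulation are not reproduced here.

```python
import numpy as np
from sklearn.svm import LinearSVC


def svm_deflation_directions(X, y, n_components, C=1.0):
    """Collect unit normal vectors of successively trained linear SVMs.

    NOTE: a sketch of the linear, binary case only; LinearSVC is an assumed
    stand-in for the paper's SVM formulation (the within-class dispersion
    penalty is omitted). After each SVM is trained, the component along its
    normal vector is removed from every sample (deflation), so the next SVM
    is forced to find a discriminant direction orthogonal to the previous ones.
    """
    X_curr = np.asarray(X, dtype=float).copy()
    directions = []
    for _ in range(n_components):
        clf = LinearSVC(C=C, dual=False).fit(X_curr, y)
        w = clf.coef_.ravel()
        w = w / np.linalg.norm(w)
        directions.append(w)
        # Deflation: project every sample onto the orthogonal complement of w
        X_curr = X_curr - np.outer(X_curr @ w, w)
    return np.vstack(directions)      # shape: (n_components, n_features)


# Example use: project data onto the extracted discriminant subspace
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
    y = np.array([0] * 50 + [1] * 50)
    W = svm_deflation_directions(X, y, n_components=2)
    X_reduced = X @ W.T               # reduced-dimensionality features
    print(X_reduced.shape)            # (100, 2)
```

Because each linear SVM normal lies in the span of its (deflated) training data, successive directions come out orthogonal to the earlier ones, which is what allows them to serve jointly as a projection basis.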
Keywords: Support Vector Machine, Feature Extraction, Linear Discriminant Analysis, Scatter Matrix, Discriminant Information