Abstract
A new method for dimensionality reduction and feature extraction is proposed, based on Support Vector Machines and on minimization of the within-class data dispersion. An iterative procedure successively applies Support Vector Machines on perpendicular subspaces, using the deflation transformation, in such a way that the within-class variance is minimized. The proposed approach is proved to be equivalent to a successive SVM with deflation kernels. The normal vectors of the successive separating hyperplanes carry discriminant information and can be used as projection vectors for feature extraction and dimensionality reduction of the data. Experiments on various datasets are conducted in order to highlight the superior performance of the proposed algorithm.
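As a rough illustration of the deflation idea described in the abstract, the following Python sketch trains a sequence of linear SVMs, each on the data projected onto the subspace perpendicular to the previously extracted hyperplane normals, and collects the unit-norm normal vectors as projection directions. This is a minimal sketch only, not the authors' algorithm: it uses scikit-learn's standard LinearSVC rather than a minimum-class-variance SVM, assumes a two-class problem, and the function name, parameters, and the X_train / y_train variables are hypothetical.

    # Sketch: successive linear SVMs on deflated (perpendicular) subspaces.
    # Not the paper's exact method (no within-class variance minimization);
    # it only illustrates deflation-based extraction of projection vectors.
    import numpy as np
    from sklearn.svm import LinearSVC

    def successive_svm_projections(X, y, n_components, C=1.0):
        """Return an (n_features, n_components) matrix of projection vectors.

        Assumes a binary labelling y; each column is the unit-norm normal
        vector of one SVM hyperplane, trained after deflating the data.
        """
        X_defl = X.astype(float).copy()
        directions = []
        for _ in range(n_components):
            svm = LinearSVC(C=C).fit(X_defl, y)
            w = svm.coef_.ravel()
            w = w / np.linalg.norm(w)  # unit-norm normal of the hyperplane
            directions.append(w)
            # Deflation: remove the component of the data along w, so the
            # next SVM is trained on the subspace perpendicular to all
            # previously extracted directions.
            X_defl = X_defl - np.outer(X_defl @ w, w)
        return np.column_stack(directions)

    # Usage (X_train, y_train assumed to be a labelled two-class training set):
    # W = successive_svm_projections(X_train, y_train, n_components=5)
    # X_reduced = X_train @ W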
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Pitelis, N., Tefas, A. (2012). Discriminant Subspace Learning Based on Support Vectors Machines. In: Perner, P. (eds) Machine Learning and Data Mining in Pattern Recognition. MLDM 2012. Lecture Notes in Computer Science, vol 7376. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31537-4_16
DOI: https://doi.org/10.1007/978-3-642-31537-4_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-31536-7
Online ISBN: 978-3-642-31537-4
eBook Packages: Computer Science (R0)