Abstract
Similarity-based classification methods use positive semi-definite (PSD) similarity matrices. When several data representations (or metrics) are available, they must be combined into a single similarity matrix. The resulting combination is often an indefinite matrix and cannot be used to train the classifier. In this paper we introduce new methods to build a PSD matrix from an indefinite matrix. The obtained matrices are used as input kernels to train Support Vector Machines (SVMs) for classification tasks. Experimental results on artificial and real data sets are reported.
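The paper's specific constructions are not reproduced on this page. As an illustration only, one standard way to obtain a PSD kernel from an indefinite symmetric similarity matrix is spectral clipping: diagonalize the matrix and zero out its negative eigenvalues. The helper name `clip_to_psd` below is hypothetical, not from the paper; this is a minimal sketch, assuming the similarity matrix is real and symmetric.

```python
import numpy as np

def clip_to_psd(S, tol=0.0):
    """Project a symmetric indefinite matrix onto the PSD cone
    by zeroing its negative eigenvalues (spectral clipping).
    This is one generic approach, not necessarily the paper's method."""
    # Symmetrize to guard against numerical asymmetry.
    S = (S + S.T) / 2.0
    # Eigendecomposition of the symmetric matrix.
    w, V = np.linalg.eigh(S)
    # Clip eigenvalues below `tol` up to `tol` (default: drop negatives).
    w_clipped = np.clip(w, tol, None)
    # Reassemble; the result is PSD by construction.
    return V @ np.diag(w_clipped) @ V.T

# Example: a symmetric similarity matrix with one negative eigenvalue.
S = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])
K = clip_to_psd(S)
print(np.all(np.linalg.eigvalsh(K) >= -1e-10))  # True: K is PSD
```

The clipped matrix `K` can then be passed to an SVM as a precomputed kernel; other generic repairs (shifting the spectrum, flipping the sign of negative eigenvalues) follow the same eigendecomposition pattern.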
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Muñoz, A., Martín de Diego, I. (2006). From Indefinite to Positive Semi-Definite Matrices. In: Yeung, D.Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2006. Lecture Notes in Computer Science, vol 4109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11815921_84
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37236-3
Online ISBN: 978-3-540-37241-7