Abstract

Similarity-based classification methods use positive semi-definite (PSD) similarity matrices. When several data representations (or metrics) are available, they must be combined into a single similarity matrix. The resulting combination is often an indefinite matrix and therefore cannot be used to train the classifier. In this paper we introduce new methods to build a PSD matrix from an indefinite matrix. The resulting matrices are used as input kernels to train Support Vector Machines (SVMs) for classification tasks. Experimental results on artificial and real data sets are reported.
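The repair the abstract describes can be illustrated with two standard spectral fixes for an indefinite symmetric matrix: clipping negative eigenvalues to zero, or shifting the whole spectrum so the smallest eigenvalue becomes zero. This is a minimal sketch of those generic repairs; the paper's own euclideanization and bending methods may differ in detail, and the function name `make_psd` is illustrative.

```python
import numpy as np

def make_psd(K, method="clip"):
    """Turn an indefinite symmetric similarity matrix into a PSD one.

    Two generic spectral repairs (illustrative assumptions, not the
    paper's exact procedures):
      - "clip":  replace negative eigenvalues by zero
      - "shift": add |lambda_min| to every eigenvalue, i.e. K + |lambda_min| I
    """
    K = (K + K.T) / 2.0            # symmetrize to guard against round-off
    w, V = np.linalg.eigh(K)       # eigendecomposition of a symmetric matrix
    if method == "clip":
        w = np.clip(w, 0.0, None)
    elif method == "shift":
        w = w - min(w.min(), 0.0)
    else:
        raise ValueError(f"unknown method: {method}")
    return V @ np.diag(w) @ V.T

# A combination of similarities need not be PSD: this 2x2 matrix has a
# negative determinant, hence one negative eigenvalue.
S = np.array([[1.0, 0.9],
              [0.9, 0.2]])
K = make_psd(S)                    # all eigenvalues now >= 0 (up to round-off)
```

The clipped matrix `K` is then usable as a precomputed kernel for an SVM; clipping discards the "negative" part of the spectrum, while shifting keeps all eigenvectors but inflates the diagonal.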

Keywords

Support Vector Machine · Kernel Matrix · Euclideanization Method · Bending Method · Input Kernel


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Alberto Muñoz¹
  • Isaac Martín de Diego²

  1. University Carlos III de Madrid, Getafe, Spain
  2. University Rey Juan Carlos, Móstoles, Spain
