Spectral Measures for Kernel Matrices Comparison

  • Conference paper
Artificial Neural Networks – ICANN 2007 (ICANN 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4668)


Abstract

With the emergence of data fusion techniques (kernel combinations, ensemble methods, and boosting algorithms), the task of comparing distance, similarity, and kernel matrices is becoming increasingly relevant. However, choosing an appropriate metric for the matrices involved in pattern recognition problems is far from trivial.

In this work we propose a general spectral framework for building metrics on matrix spaces. Within the framework of matrix pencils, we introduce a new metric for symmetric positive semi-definite matrices, called the Pencil Distance (PD). The generality of our approach is demonstrated by showing that the Kernel Alignment (KA) measure is a particular case of it.

We illustrate the performance of the proposed measures on several classification problems.
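Of the two measures named above, only Kernel Alignment has a standard published definition (Cristianini and Shawe-Taylor): the cosine of the angle between two kernel matrices under the Frobenius inner product. The Pencil Distance is defined in the paper's full text and is not reproduced here. A minimal NumPy sketch of KA on toy data (the data and kernel choices are illustrative, not from the paper):

```python
import numpy as np

def kernel_alignment(K1, K2):
    """Kernel Alignment: <K1, K2>_F / (||K1||_F * ||K2||_F),
    i.e. the cosine of the angle between the two matrices
    under the Frobenius inner product. Ranges in [-1, 1];
    equals 1 when the matrices are proportional."""
    num = np.sum(K1 * K2)                           # Frobenius inner product
    den = np.linalg.norm(K1) * np.linalg.norm(K2)   # Frobenius norms
    return num / den

# Toy comparison: a linear kernel vs. an RBF kernel on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
K_lin = X @ X.T                                 # linear kernel matrix
sq = np.sum(X**2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2 * K_lin      # squared pairwise distances
K_rbf = np.exp(-D2 / 2.0)                       # RBF kernel, bandwidth 1

print(kernel_alignment(K_lin, K_rbf))
```

A matrix is always perfectly aligned with itself (`kernel_alignment(K, K) == 1`), which is the sanity check typically used before comparing different kernels.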






Editor information

Joaquim Marques de Sá, Luís A. Alexandre, Włodzisław Duch, Danilo Mandic


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

González, J., Muñoz, A. (2007). Spectral Measures for Kernel Matrices Comparison. In: de Sá, J.M., Alexandre, L.A., Duch, W., Mandic, D. (eds) Artificial Neural Networks – ICANN 2007. ICANN 2007. Lecture Notes in Computer Science, vol 4668. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74690-4_74

  • DOI: https://doi.org/10.1007/978-3-540-74690-4_74

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74689-8

  • Online ISBN: 978-3-540-74690-4

  • eBook Packages: Computer Science (R0)
