Abstract
With the emergence of data fusion techniques (kernel combinations, ensemble methods and boosting algorithms), the task of comparing distance, similarity and kernel matrices is becoming increasingly relevant. However, choosing an appropriate metric for the matrices involved in pattern recognition problems is far from trivial.
In this work we propose a general spectral framework for building metrics on matrix spaces. Based on matrix pencils, we introduce a new metric for symmetric positive semi-definite matrices, called the Pencil Distance (PD). The generality of our approach is demonstrated by showing that the Kernel Alignment (KA) measure is a particular case of our spectral framework.
We illustrate the performance of the proposed measures on several classification problems.
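For concreteness, the Kernel Alignment measure mentioned in the abstract has a standard closed form (the cosine of two kernel matrices under the Frobenius inner product). The abstract does not reproduce the exact definition of the Pencil Distance, so the `pencil_distance` function below is only an illustrative sketch of a pencil-style spectral measure: it compares the generalized eigenvalues of the pencil (K1, K2) against 1. The regularizer `eps` and the log-spectrum norm are assumptions of this sketch, not the authors' definition.

```python
import numpy as np

def kernel_alignment(K1, K2):
    """Kernel Alignment (Cristianini & Shawe-Taylor): cosine between
    two kernel matrices under the Frobenius inner product."""
    num = np.sum(K1 * K2)
    return num / (np.linalg.norm(K1, "fro") * np.linalg.norm(K2, "fro"))

def pencil_distance(K1, K2, eps=1e-8):
    """Hypothetical pencil-style distance (illustrative only): measures
    how far the generalized eigenvalues of the pencil (K1, K2) deviate
    from 1.  When K1 == K2 all generalized eigenvalues equal 1, so the
    distance is numerically zero.  eps regularizes K2 so the pencil is
    definite even for rank-deficient kernel matrices."""
    n = K1.shape[0]
    # Reduce the pencil to an ordinary symmetric eigenproblem via Cholesky:
    # with K2 + eps*I = L L^T, the generalized eigenvalues of (K1, K2)
    # equal the eigenvalues of L^{-1} K1 L^{-T}.
    L = np.linalg.cholesky(K2 + eps * np.eye(n))
    Linv = np.linalg.inv(L)
    w = np.linalg.eigvalsh(Linv @ K1 @ Linv.T)
    return np.linalg.norm(np.log(np.maximum(w, eps)))
```

A matrix compared with itself yields alignment 1 and a pencil distance near zero; two unrelated kernel matrices yield a strictly larger pencil distance, which is the qualitative behaviour a matrix metric should exhibit.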
© 2007 Springer-Verlag Berlin Heidelberg
González, J., Muñoz, A. (2007). Spectral Measures for Kernel Matrices Comparison. In: de Sá, J.M., Alexandre, L.A., Duch, W., Mandic, D. (eds) Artificial Neural Networks – ICANN 2007. ICANN 2007. Lecture Notes in Computer Science, vol 4668. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74690-4_74
Print ISBN: 978-3-540-74689-8
Online ISBN: 978-3-540-74690-4