Model Selection in Kernel Methods Based on a Spectral Analysis of Label Information
We propose a novel method for addressing the model selection problem in the context of kernel methods. In contrast to existing methods, which rely on hold-out testing or try to compensate for the optimism of the generalization error, our method is based on a structural analysis of the label information using the eigenstructure of the kernel matrix. In this setting, the label vector can be transformed into a representation in which the smooth information is easily discernible from the noise. This makes it possible to estimate a cut-off dimension such that the leading coefficients in that representation contain the learnable information, while the remaining coefficients are discarded as noise. Based on this cut-off dimension, the regularization parameter for kernel ridge regression is estimated.
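The pipeline described in the abstract can be sketched in a few lines: eigendecompose the kernel matrix, express the label vector in the eigenbasis, estimate a cut-off dimension separating signal from noise, and derive a ridge parameter for kernel ridge regression. The cut-off heuristic and the eigenvalue-based ridge choice below are illustrative assumptions, not the paper's exact estimators; the kernel bandwidth `gamma` and the noise-floor quantile are likewise assumed for the demo.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix; gamma is an assumed bandwidth.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def spectral_label_analysis(K, y):
    # Eigendecompose the (symmetric) kernel matrix, eigenvalues descending,
    # and express the label vector in the eigenbasis.
    w, U = np.linalg.eigh(K)
    order = np.argsort(w)[::-1]
    w, U = w[order], U[:, order]
    coeffs = U.T @ y  # spectral coefficients of the labels
    return w, coeffs

def estimate_cutoff(coeffs, noise_quantile=0.9):
    # Illustrative heuristic: treat the tail half of the spectrum as noise,
    # and keep leading coefficients that rise above that noise floor.
    tail = np.abs(coeffs[len(coeffs) // 2:])
    threshold = np.quantile(tail, noise_quantile)
    above = np.abs(coeffs) > threshold
    return int(np.max(np.nonzero(above)[0])) + 1 if above.any() else 1

def krr_fit_predict(K, y, ridge):
    # Kernel ridge regression: predictions at the training points.
    alpha = np.linalg.solve(K + ridge * np.eye(K.shape[0]), y)
    return K @ alpha

# Synthetic example: a smooth signal corrupted by observation noise.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 100)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)

K = rbf_kernel(X, gamma=10.0)
eigvals, coeffs = spectral_label_analysis(K, y)
d = estimate_cutoff(coeffs)
# Assumed link between cut-off dimension and regularization: set the ridge
# to the eigenvalue at the cut-off, so components past dimension d are damped.
ridge = max(eigvals[min(d, len(eigvals) - 1)], 1e-12)
yhat = krr_fit_predict(K, y, ridge)
```

The intuition matches the abstract: in the eigenbasis of the kernel matrix, a smooth target concentrates its energy in a few leading coefficients, while i.i.d. noise spreads uniformly across all of them, so a noise floor estimated from the tail of the spectrum identifies where the learnable information ends.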