A Leave-K-Out Cross-Validation Scheme for Unsupervised Kernel Regression

  • Stefan Klanke
  • Helge Ritter
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4132)


Abstract

We show how to employ leave-K-out cross-validation in Unsupervised Kernel Regression (UKR), a recent method for learning nonlinear manifolds. We thereby generalize an existing regularization method, yielding more flexibility without additional computational cost. We demonstrate our method on both toy and real data.
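To make the scheme concrete: in UKR each data point is reconstructed by a Nadaraya-Watson estimate over the latent positions, and the standard leave-one-out regularization amounts to zeroing the diagonal of the latent kernel matrix before normalization. The NumPy sketch below illustrates one way a leave-K-out variant can be computed; it assumes a Gaussian kernel and assumes the excluded set for each point is itself plus its K-1 nearest latent neighbours. The function and parameter names are ours, and the details may differ from the authors' formulation.

```python
import numpy as np

def ukr_leave_k_out_error(X, Y, K=1, bandwidth=1.0):
    """Hypothetical leave-K-out reconstruction error for UKR.

    X: (q, N) latent coordinates, Y: (d, N) observed data points.
    With K=1 this reduces to the usual leave-one-out scheme
    (zeroed diagonal of the kernel matrix).
    """
    N = X.shape[1]
    # Pairwise squared distances between latent points.
    sq = np.sum(X**2, axis=0)
    D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X.T @ X, 0.0)
    # Gaussian kernel matrix, B[i, j] = K(x_i - x_j).
    B = np.exp(-D2 / (2.0 * bandwidth**2))

    # Leave-K-out: for each point j, zero the contributions of its K
    # nearest latent neighbours (the first is j itself, distance 0).
    nearest = np.argsort(D2, axis=0)[:K, :]  # shape (K, N)
    B[nearest, np.arange(N)] = 0.0

    # Normalize columns so each reconstruction is a convex combination.
    B /= np.maximum(B.sum(axis=0, keepdims=True), 1e-12)

    # f(x_j) = sum_i y_i * B[i, j]; mean squared reconstruction error.
    return np.mean(np.sum((Y - Y @ B) ** 2, axis=0))
```

Minimizing this error with respect to X by gradient-based optimization would then yield the regularized latent coordinates. Note that in this sketch the neighbour selection reuses the pairwise latent distances that UKR computes anyway, which is consistent with the abstract's claim of no additional computational cost.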


Keywords

Latent Space · Penalty Term · Kernel Regression · Projection Error · Nonlinear Dimensionality Reduction





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Stefan Klanke¹
  • Helge Ritter¹

  1. Neuroinformatics Group, Faculty of Technology, University of Bielefeld, Bielefeld, Germany
