
Generalization Bounds for Subspace Selection and Hyperbolic PCA

Conference paper in: Subspace, Latent Structure and Feature Selection (SLSFS 2005)

Abstract

We present a method that uses example pairs of equal or unequal class labels to select a subspace with near-optimal metric properties in a kernel-induced Hilbert space. A representation of finite-dimensional projections as bounded linear functionals on a space of Hilbert-Schmidt operators leads to PAC-type performance guarantees for the resulting feature maps. The proposed algorithm returns the projection onto the span of the principal eigenvectors of an empirical operator constructed from the example pairs. It can be applied in meta-learning environments, and experiments demonstrate effective transfer of knowledge between different but related learning tasks.
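
To make the abstract's algorithm concrete, here is a minimal Python sketch of one plausible reading: pair differences are combined into a signed empirical operator (equal-label pairs weighted negatively, unequal-label pairs positively), and the method returns the orthogonal projection onto the span of the operator's principal eigenvectors. The function name, the ±1 weighting, and the finite-dimensional (non-kernelized) setting are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def pair_subspace_projection(X, X_prime, same_label, dim):
    """Hypothetical sketch: subspace selection from labeled example pairs.

    X, X_prime : (m, d) arrays; row i of each forms the i-th example pair
    same_label : (m,) boolean array, True where a pair shares its class label
    dim        : dimension of the selected subspace
    """
    diffs = X - X_prime                        # pairwise differences
    # Assumed weighting: -1 pulls equal-label pairs together,
    # +1 pushes unequal-label pairs apart.
    signs = np.where(same_label, -1.0, 1.0)
    # Signed empirical operator: (1/m) * sum_i s_i (x_i - x'_i)(x_i - x'_i)^T.
    # It is symmetric but, unlike a covariance, generally indefinite.
    T = (diffs * signs[:, None]).T @ diffs / len(diffs)
    eigvals, eigvecs = np.linalg.eigh(T)
    U = eigvecs[:, np.argsort(eigvals)[::-1][:dim]]  # principal eigenvectors
    return U @ U.T                             # projection onto their span

# Toy usage: random pairs, then map examples through the selected subspace.
rng = np.random.default_rng(0)
X, Xp = rng.normal(size=(200, 10)), rng.normal(size=(200, 10))
P = pair_subspace_projection(X, Xp, rng.random(200) < 0.5, dim=3)
Z = X @ P
```

The indefiniteness of the operator (eigenvalues of either sign, in contrast with the positive semidefinite covariance of ordinary PCA) is presumably what "hyperbolic" refers to in the title; that reading is an inference from the abstract rather than a statement from the paper.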




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Maurer, A. (2006). Generalization Bounds for Subspace Selection and Hyperbolic PCA. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_13


  • DOI: https://doi.org/10.1007/11752790_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34137-6

  • Online ISBN: 978-3-540-34138-3

  • eBook Packages: Computer Science (R0)
