
Spectral Dimensionality Reduction

  • Harry Strange
  • Reyer Zwiggelaar
Chapter
Part of the SpringerBriefs in Computer Science book series (BRIEFSCOMPUTER)

Abstract

This chapter provides a common mathematical framework that forms the basis for subsequent chapters. Generic aspects of spectral dimensionality reduction are covered first, after which specific dimensionality reduction approaches are briefly described.
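
As a rough illustration (not taken from the chapter) of the shared recipe behind such methods: construct a pairwise similarity matrix from the data, turn it into a feature (kernel) matrix, and read the low-dimensional embedding off its leading eigenvectors. The sketch below assumes a Gaussian kernel and a kernel-PCA-style centring and eigendecomposition step; the names gaussian_kernel, spectral_embedding and sigma are illustrative only, not from the book.

    # Minimal sketch of a generic spectral embedding step, assuming a
    # Gaussian kernel and a kernel-PCA-style centring/eigendecomposition.
    # Function names and parameters are illustrative, not from the book.
    import numpy as np

    def gaussian_kernel(X, sigma=1.0):
        """Pairwise Gaussian affinities K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
        sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq_dists / (2.0 * sigma ** 2))

    def spectral_embedding(X, n_components=2, sigma=1.0):
        """Embed X into n_components dimensions using the leading
        eigenvectors of a double-centred kernel matrix."""
        n = X.shape[0]
        K = gaussian_kernel(X, sigma)
        H = np.eye(n) - np.ones((n, n)) / n      # centring matrix
        Kc = H @ K @ H                            # double-centred kernel
        eigvals, eigvecs = np.linalg.eigh(Kc)     # eigenvalues in ascending order
        idx = np.argsort(eigvals)[::-1][:n_components]
        # Scale eigenvectors by the square root of their eigenvalues
        # to obtain embedding coordinates.
        return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0.0))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 5))             # toy high-dimensional data
        Y = spectral_embedding(X, n_components=2)
        print(Y.shape)                            # (100, 2)

Individual spectral methods differ mainly in how the similarity or graph matrix is built; the eigendecomposition step that yields the embedding is essentially the same across them.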

Keywords

Spectral dimensionality reduction algorithms · Kernel methods · Spectral graph theory


Copyright information

© The Author(s) 2014

Authors and Affiliations

  1. Department of Computer Science, Aberystwyth University, Aberystwyth, UK
