
Nonrigid Embeddings for Dimensionality Reduction

  • Matthew Brand
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3720)

Abstract

Spectral methods for embedding graphs and immersing data manifolds in low-dimensional spaces are notoriously unstable due to insufficient and/or numerically ill-conditioned constraint sets. We show why this is endemic to spectral methods, and develop low-complexity solutions for stiffening ill-conditioned problems and regularizing ill-posed problems, with proofs of correctness. The regularization exploits sparse but complementary constraints on affine rigidity and edge lengths to obtain isometric embeddings. An implemented algorithm is fast, accurate, and industrial-strength: experiments with problem sizes spanning four orders of magnitude show O(N) scaling. We demonstrate with speech data.
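
For context, the abstract refers to the standard spectral approach to embedding, in which coordinates are read off eigenvectors of a graph Laplacian. Below is a minimal sketch of that baseline in the Laplacian-eigenmap style, assuming a k-nearest-neighbour graph built with scikit-learn; the function spectral_embed and its parameters are illustrative, not the paper's API, and this is not the paper's stiffening or regularization algorithm, only the kind of embedding whose conditioning the paper analyses.

    import numpy as np
    from scipy.linalg import eigh
    from sklearn.neighbors import kneighbors_graph

    def spectral_embed(X, n_components=2, k=10):
        """Embed rows of X via the bottom eigenvectors of a graph Laplacian."""
        # Symmetrized adjacency of the k-nearest-neighbour graph (binary weights).
        W = kneighbors_graph(X, k, mode="connectivity").toarray()
        W = np.maximum(W, W.T)
        # Unnormalized graph Laplacian L = D - W.
        D = np.diag(W.sum(axis=1))
        L = D - W
        # Generalized eigenproblem L v = lambda D v; eigenvalues are returned in
        # ascending order, and the constant eigenvector (lambda = 0) is dropped.
        vals, vecs = eigh(L, D)
        return vecs[:, 1:n_components + 1]

    # Example: embed 500 random 20-dimensional points into the plane.
    # Y = spectral_embed(np.random.rand(500, 20), n_components=2, k=8)

The instability the abstract describes shows up here as small gaps between the retained and discarded eigenvalues of L: when the constraint set is weak or ill-conditioned, tiny perturbations of the neighbourhood graph can rotate or collapse the recovered embedding.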

Keywords

Dimensionality Reduction · Isometric Embedding · Distance Constraint · Locally Linear Embedding · Constraint Matrix
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.


Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Matthew Brand
  1. Mitsubishi Electric Research Labs, Cambridge, USA
