Abstract
Spectral methods for embedding graphs and immersing data manifolds in low-dimensional spaces are notoriously unstable due to insufficient and/or numerically ill-conditioned constraint sets. We show why this is endemic to spectral methods, and develop low-complexity solutions for stiffening ill-conditioned problems and regularizing ill-posed problems, with proofs of correctness. The regularization exploits sparse but complementary constraints on affine rigidity and edge lengths to obtain isometric embeddings. The implemented algorithm is fast, accurate, and industrial-strength: experiments with problem sizes spanning four orders of magnitude show O(N) scaling. We demonstrate the method on speech data.
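The paper's own algorithm is not reproduced on this page, but as context for the class of spectral methods whose instability the abstract describes, the sketch below is a minimal Laplacian-eigenmaps embedding in Python. It is a generic illustration only, not Brand's method; the function name laplacian_eigenmap and the parameters k and d are assumptions introduced here for the example.

```python
# Minimal sketch of a spectral embedding (Laplacian eigenmaps).
# Illustrative only -- NOT the paper's algorithm; the paper's contribution
# is the stiffening/regularization of this kind of eigenproblem.
import numpy as np

def laplacian_eigenmap(X, k=10, d=2):
    """Embed points X (N x D) into d dimensions via a k-NN graph Laplacian."""
    N = X.shape[0]
    # Pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    # Symmetric k-nearest-neighbor adjacency (binary weights for simplicity);
    # column 0 of the argsort is each point itself, so it is skipped
    idx = np.argsort(D2, axis=1)[:, 1:k + 1]
    W = np.zeros((N, N))
    rows = np.repeat(np.arange(N), k)
    W[rows, idx.ravel()] = 1.0
    W = np.maximum(W, W.T)
    # Unnormalized graph Laplacian L = Deg - W
    L = np.diag(W.sum(axis=1)) - W
    # The embedding lives in the smallest nontrivial eigenvectors of L
    # (eigenvalue 0 with a constant eigenvector is discarded)
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:d + 1]

# Usage: flatten a noisy 3-D helix to 2-D
t = np.linspace(0, 4 * np.pi, 500)
X = np.c_[np.cos(t), np.sin(t), 0.1 * t] + 0.01 * np.random.randn(500, 3)
Y = laplacian_eigenmap(X, k=8, d=2)
```

In methods of this family the solution is the span of the Laplacian's smallest nontrivial eigenvectors; when the eigengap separating them from the remaining spectrum is tiny, small perturbations of the data mix the eigenvectors, which is the kind of numerical ill-conditioning the abstract refers to.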
Keywords
- Dimensionality Reduction
- Isometric Embedding
- Distance Constraint
- Locally Linear Embedding
- Constraint Matrix
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Brand, M. (2005). Nonrigid Embeddings for Dimensionality Reduction. In: Gama, J., Camacho, R., Brazdil, P.B., Jorge, A.M., Torgo, L. (eds.) Machine Learning: ECML 2005. Lecture Notes in Computer Science, vol. 3720. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11564096_10
DOI: https://doi.org/10.1007/11564096_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-29243-2
Online ISBN: 978-3-540-31692-3