LDR-LLE: LLE with Low-Dimensional Neighborhood Representation

  • Yair Goldberg
  • Ya’acov Ritov
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5359)


The locally linear embedding (LLE) algorithm is a non-linear dimension-reduction technique that is widely used for its computational simplicity and intuitive approach. LLE first linearly reconstructs each input point from its nearest neighbors and then preserves these neighborhood relations in the low-dimensional embedding. We show that the reconstruction weights computed by LLE capture the high-dimensional structure of the neighborhoods, rather than the low-dimensional manifold structure. As a consequence, the weight vectors are highly sensitive to noise. Moreover, this causes LLE to converge to a linear projection of the input, in contrast to its non-linear embedding goal. To resolve both problems, we propose computing the weight vectors using a low-dimensional neighborhood representation; we call this technique LDR-LLE. We present numerical examples of the perturbation and linear-projection problems, and of the improved outputs obtained with the low-dimensional neighborhood representation.
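The two-stage procedure described in the abstract can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the `low_dim_neighborhoods` flag is our reading of the low-dimensional neighborhood representation (a local PCA projection of each neighborhood before solving for the weights), and the function name, parameters, and regularization constant are all hypothetical.

```python
import numpy as np

def lle(X, n_neighbors=8, d=2, reg=1e-3, low_dim_neighborhoods=False):
    """Sketch of LLE on data X of shape (n, D).

    With low_dim_neighborhoods=True, each neighborhood is projected onto
    its top-d principal directions before the weights are computed
    (an illustrative reading of the LDR-LLE idea)."""
    n = X.shape[0]
    # pairwise Euclidean distances for neighbor search
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(dists[i])[1:n_neighbors + 1]  # skip the point itself
        Z = X[idx] - X[i]                              # neighbors centered at x_i
        if low_dim_neighborhoods:
            # local PCA: keep only the top-d directions of the neighborhood
            _, _, Vt = np.linalg.svd(Z, full_matrices=False)
            Z = Z @ Vt[:d].T
        C = Z @ Z.T                                    # local Gram matrix
        C += reg * np.trace(C) * np.eye(n_neighbors)   # regularize near-singular C
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, idx] = w / w.sum()                        # reconstruction weights sum to 1
    # stage two: bottom eigenvectors of (I - W)^T (I - W),
    # skipping the constant eigenvector with eigenvalue ~0
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:d + 1]
```

Note that projecting `Z` to `d` dimensions makes the local Gram matrix rank-deficient by construction, which is why the regularization term is kept in both branches.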


Keywords: Weight Vector, Reconstruction Error, Input Point, Linear Projection, Lighting Direction





Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Yair Goldberg (1)
  • Ya’acov Ritov (1)

  1. Department of Statistics and The Center for the Study of Rationality, Hebrew University, Jerusalem, Israel
