Abstract
Hessian locally linear embedding (HLLE) computes an embedding by minimizing a Hessian functional on the manifold where the data set resides. The conceptual framework of HLLE may be viewed as a modification of the Laplacian eigenmaps framework. Let H be the observed high-dimensional data, which reside on a low-dimensional manifold M, and let h be the coordinate mapping on M, so that Y = h(H) is a dimensionality reduction (DR) of H. In the Laplacian eigenmaps method, h is found in the numerical null space of the Laplace-Beltrami operator on M, while in Hessian locally linear embedding it is found in the null space of the Hessian functional. Since the HLLE embedding is locally linear, it works well for data lying on a manifold that may not be convex. Compared with nonlinear DR methods such as Isomap, which requires the data set to lie on a convex manifold, HLLE can therefore be applied to a wider range of data. The chapter is organized as follows. In Section 13.1, we describe the Hessian locally linear embedding method and its mathematical background. In Section 13.2, the HLLE DR algorithm is introduced. Experiments with the algorithm are included in Section 13.3.
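As a minimal illustration of the idea described above (not the chapter's own code), the sketch below applies an off-the-shelf HLLE implementation, scikit-learn's `LocallyLinearEmbedding` with `method="hessian"`, to the classic Swiss-roll data set, a non-convex manifold on which HLLE is expected to work well; the sample size and neighborhood size are arbitrary choices for this example.

```python
# Hedged sketch: HLLE via scikit-learn on a non-convex manifold.
# The data set, n_samples, and n_neighbors are illustrative assumptions.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Sample points from the Swiss roll, a 2-D manifold embedded in R^3.
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# For method="hessian", scikit-learn requires
# n_neighbors > n_components * (n_components + 3) / 2,
# so with n_components=2 we need n_neighbors > 5.
hlle = LocallyLinearEmbedding(
    n_neighbors=12, n_components=2, method="hessian", random_state=0
)

# Y plays the role of the low-dimensional coordinates h(H) above.
Y = hlle.fit_transform(X)
print(Y.shape)  # (1000, 2)
```

Because the embedding is estimated from local tangent-space fits, the quality of the result is sensitive to `n_neighbors`; too few neighbors give unstable local Hessian estimates, while too many blur the local linear structure.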
Copyright information
© 2012 Higher Education Press, Beijing and Springer-Verlag Berlin Heidelberg
Cite this chapter
Wang, J. (2012). Hessian Locally Linear Embedding. In: Geometric Structure of High-Dimensional Data and Dimensionality Reduction. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27497-8_13
Print ISBN: 978-3-642-27496-1
Online ISBN: 978-3-642-27497-8