Abstract
A novel manifold learning algorithm, LPcaML, is proposed in this paper. Based on the geometric intuition that a d-dimensional manifold locally lies on, or close to, a d-dimensional linear subspace, LPcaML first finds an α-TSLN of the whole high-dimensional input data set and then obtains the low-dimensional local coordinates of each neighborhood in the α-TSLN using the classical PCA technique, preserving the local geometric and topological properties of each neighborhood. Finally, LPcaML transforms the local coordinates into a unified global low-dimensional representation by processing the neighborhoods in the order in which they appear in the α-TSLN; the transformation function for each neighborhood is obtained by solving a least-squares problem over the examples in the overlap between neighborhoods. Through this divide-and-conquer strategy, LPcaML can learn from incremental data and discover the underlying manifold efficiently even when the data set is large. Experiments on both synthetic data sets and real face data sets demonstrate the effectiveness of the LPcaML algorithm. Moreover, LPcaML can discover the manifold from sparsely sampled data sets where other manifold learning algorithms cannot.
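The two core steps described above, local PCA on each neighborhood followed by a least-squares affine alignment fitted on the overlapped examples, can be sketched as follows. This is a minimal illustration only: the function names `local_pca` and `align_to_global` are our own, and the α-TSLN construction and incremental processing of the full LPcaML algorithm are not reproduced here.

```python
import numpy as np

def local_pca(X, d):
    """Project a neighborhood X (n x D) onto its top-d principal directions.

    Returns the d-dimensional local coordinates of the points.
    """
    mean = X.mean(axis=0)
    # SVD of the centered neighborhood yields the principal directions.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:d].T                                  # D x d orthonormal basis
    return (X - mean) @ basis

def align_to_global(local_coords, shared_local, shared_global):
    """Map a neighborhood's local coordinates into the global frame.

    An affine transformation is fitted by least squares on the examples
    shared with already-processed neighborhoods, then applied to all
    local coordinates of the new neighborhood.
    """
    n = shared_local.shape[0]
    # Affine design matrix: local coordinates augmented with a constant column.
    A = np.hstack([shared_local, np.ones((n, 1))])
    T, *_ = np.linalg.lstsq(A, shared_global, rcond=None)
    return np.hstack([local_coords, np.ones((len(local_coords), 1))]) @ T

# Toy usage: points on a 2-D plane embedded in 3-D, split into two
# overlapping neighborhoods; the second is aligned to the first.
rng = np.random.default_rng(0)
P = rng.standard_normal((60, 2))
X = P @ np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # embed plane in 3-D
YA = local_pca(X[:40], 2)          # neighborhood A: points 0..39
YB = local_pca(X[20:], 2)          # neighborhood B: points 20..59 (overlap 20..39)
YB_global = align_to_global(YB, shared_local=YB[:20], shared_global=YA[20:])
```

Because the toy data lie exactly on a plane, the aligned coordinates of the overlapped examples coincide with those computed in neighborhood A, which is the consistency property the least-squares step is designed to enforce.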
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhan, Y., Yin, J., Zhang, G., Zhu, E. (2008). Incremental Manifold Learning Algorithm Using PCA on Overlapping Local Neighborhoods for Dimensionality Reduction. In: Kang, L., Cai, Z., Yan, X., Liu, Y. (eds) Advances in Computation and Intelligence. ISICA 2008. Lecture Notes in Computer Science, vol 5370. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-92137-0_45
Print ISBN: 978-3-540-92136-3
Online ISBN: 978-3-540-92137-0