
Incremental Manifold Learning Algorithm Using PCA on Overlapping Local Neighborhoods for Dimensionality Reduction

  • Conference paper
Advances in Computation and Intelligence (ISICA 2008)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5370)


Abstract

This paper proposes a novel manifold learning algorithm called LPcaML. Based on the geometric intuition that a d-dimensional manifold locally lies on or close to a d-dimensional linear space, LPcaML first finds an α-TSLN of the whole high-dimensional input data set, then obtains low-dimensional local coordinates for each neighborhood in the α-TSLN using the classical PCA technique, preserving the local geometric and topological properties of each neighborhood. Finally, LPcaML transforms the local coordinates into a unified global low-dimensional representation by processing the neighborhoods in the order they appear in the α-TSLN; the transformation function for each neighborhood is obtained by solving a least-squares problem over the examples shared with previously processed neighborhoods. Through this divide-and-conquer strategy, LPcaML can learn from incremental data and discover the underlying manifold efficiently even when the data set is large. Experiments on both synthetic data sets and real face data sets demonstrate the effectiveness of LPcaML. Moreover, LPcaML can discover the manifold from sparsely sampled data sets where other manifold learning algorithms cannot.
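The core step the abstract describes — PCA coordinates per neighborhood, then a least-squares transformation fitted on the overlapping examples to stitch neighborhoods into one global frame — can be sketched as follows. This is an illustrative toy, not the authors' implementation: the α-TSLN construction is omitted, only two hand-picked overlapping neighborhoods are aligned, and the helper names (`local_pca`, `fit_affine`) are our own.

```python
import numpy as np

def local_pca(X, d):
    """Project points X (n x D) onto their top-d principal subspace (local coordinates)."""
    Xc = X - X.mean(axis=0)
    # Rows of Vt are principal directions; keep the leading d of them.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T

def fit_affine(src, dst):
    """Least-squares affine map (A, b) with dst ≈ src @ A + b."""
    src_h = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return M[:-1], M[-1]

# Toy data: near-planar points in 3-D (a 2-D manifold sampled with small noise).
rng = np.random.default_rng(0)
P = rng.normal(size=(40, 3))
P[:, 2] = 0.01 * rng.normal(size=40)          # almost flat in the z direction

# Two overlapping neighborhoods: points 0..24 and 15..39 (overlap 15..24).
N1, N2 = P[:25], P[15:]
Y1, Y2 = local_pca(N1, 2), local_pca(N2, 2)   # independent local PCA coordinates

# Align N2's frame to N1's via the 10 shared points
# (indices 15..24 in N1 correspond to indices 0..9 in N2).
A, b = fit_affine(Y2[:10], Y1[15:])
Y2_global = Y2 @ A + b                        # N2's coordinates in the unified frame
```

Because both neighborhoods sample the same nearly flat surface, the affine map recovered from the overlap carries the second neighborhood's PCA coordinates into the first's frame almost exactly; processing neighborhoods one at a time this way is what makes the scheme incremental.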





Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhan, Y., Yin, J., Zhang, G., Zhu, E. (2008). Incremental Manifold Learning Algorithm Using PCA on Overlapping Local Neighborhoods for Dimensionality Reduction. In: Kang, L., Cai, Z., Yan, X., Liu, Y. (eds) Advances in Computation and Intelligence. ISICA 2008. Lecture Notes in Computer Science, vol 5370. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-92137-0_45


  • DOI: https://doi.org/10.1007/978-3-540-92137-0_45

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-92136-3

  • Online ISBN: 978-3-540-92137-0

  • eBook Packages: Computer Science; Computer Science (R0)
