Similarity-Based Sparse Feature Extraction Using Local Manifold Learning

  • Cheong Hee Park
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3918)


Feature extraction is an important preprocessing step in many areas such as data mining, pattern recognition, and scientific visualization. In this paper, a new method for sparse feature extraction using local manifold learning is proposed. Similarities are first computed within each neighborhood to capture local geometric structure, producing a sparse feature representation. Based on the constructed similarity matrix, linear dimension reduction is then applied to enhance similarities among elements of the same class and to extract features that are optimal for classification. Since similarities are computed only within a neighborhood, the sparsity of the similarity matrix yields computational efficiency and memory savings. Experimental results demonstrate the superior performance of the proposed method.
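The full text is not available here, but the first step the abstract describes — computing similarities only within each point's neighborhood so the resulting matrix is sparse — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian kernel, the choice of k, and the symmetrization are assumptions, since the paper's exact similarity function is not given in this excerpt.

```python
import numpy as np

def knn_similarity_matrix(X, k=3, sigma=1.0):
    """Build a similarity matrix that is nonzero only for each point's
    k nearest neighbours (Gaussian similarities; all other entries stay 0)."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    S = np.zeros((n, n))
    for i in range(n):
        order = np.argsort(d2[i])
        nbrs = order[1:k + 1]  # skip the point itself (distance 0)
        S[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    # symmetrise so similarity is mutual
    return np.maximum(S, S.T)

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
S = knn_similarity_matrix(X, k=3)
print(np.count_nonzero(S))  # far fewer nonzeros than the 400 of a dense matrix
```

With n points, only O(nk) entries are nonzero instead of n², which is where the computational and memory savings claimed in the abstract come from; a subsequent linear dimension-reduction step (e.g., an LDA-style projection on this representation) would then extract the discriminative features.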


Keywords: Linear Discriminant Analysis · Similarity Matrix · Nonlinear Dimensionality Reduction · Sparse Feature · Local Manifold




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Cheong Hee Park
    Dept. of Computer Science and Engineering, Chungnam National University, Daejeon, Korea
