
Spectral Clustering Algorithm Based on Local Sparse Representation

  • Sen Wu
  • Min Quan
  • Xiaodong Feng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8206)

Abstract

Clustering based on sparse representation is an important technique in machine learning and data mining. However, it is time-consuming because it constructs an \(\ell_1\)-graph by solving an \(\ell_1\)-minimization problem for each sample with all other samples as the dictionary. This paper focuses on improving the efficiency of clustering based on sparse representation. Specifically, the Spectral Clustering Algorithm Based on Local Sparse Representation (SCAL) is proposed. For each sample, the algorithm solves the \(\ell_1\)-minimization problem with the sample's local \(k\) nearest neighbors as the dictionary, constructs the similarity matrix by computing the sparsity induced similarity (SIS) from the resulting sparse coefficients, and then applies spectral clustering to the similarity matrix to cluster the samples. Experiments on the face recognition data sets ORL and Extended Yale B demonstrate that the proposed SCAL achieves better clustering performance with less time consumption.
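The pipeline described above can be illustrated with a short sketch. The Python code below is a minimal reconstruction assuming scikit-learn building blocks: Lasso is used as a stand-in for the \(\ell_1\)-minimization solver, the SIS weights are approximated by normalizing the nonnegative parts of the coefficients, and spectral clustering is run on the resulting affinity matrix. The function name scal_cluster and the parameters k, alpha_l1, and n_clusters are illustrative assumptions, not interfaces or values from the paper.

    # Minimal sketch of a SCAL-like pipeline (assumptions noted in the text above).
    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from sklearn.linear_model import Lasso
    from sklearn.cluster import SpectralClustering

    def scal_cluster(X, n_clusters, k=10, alpha_l1=0.01):
        """Cluster rows of X via local sparse representation + spectral clustering."""
        n = X.shape[0]
        # Step 1: local dictionaries -- the k nearest neighbors of each sample.
        nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
        _, idx = nn.kneighbors(X)          # idx[i, 0] is the sample itself

        W = np.zeros((n, n))
        for i in range(n):
            neighbors = idx[i, 1:]         # exclude the sample itself
            D = X[neighbors].T             # dictionary: columns are the k neighbors
            # Step 2: sparse coding of sample i over its local dictionary
            # (lasso as an l1-regularized surrogate of l1-minimization).
            coef = Lasso(alpha=alpha_l1, max_iter=5000).fit(D, X[i]).coef_
            # Step 3: sparsity induced similarity -- keep nonnegative parts, normalize.
            pos = np.maximum(coef, 0.0)
            if pos.sum() > 0:
                W[i, neighbors] = pos / pos.sum()

        # Symmetrize the similarity matrix before spectral clustering.
        S = 0.5 * (W + W.T)
        return SpectralClustering(
            n_clusters=n_clusters, affinity="precomputed"
        ).fit_predict(S)

In this sketch each \(\ell_1\) problem is solved over only \(k\) atoms instead of all other \(n-1\) samples, which is the source of the efficiency gain claimed in the abstract.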

Keywords

Spectral Clustering · Weight Matrix · Sparse Representation · \(k\)-NN



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Sen Wu (1)
  • Min Quan (1)
  • Xiaodong Feng (1)
  1. Dongling School of Economics and Management, University of Science and Technology Beijing, Beijing, P.R. China
