
Rank–sparsity balanced representation for subspace clustering

  • Special Issue Paper
  • Published in Machine Vision and Applications

Abstract

Subspace learning has many applications, such as motion segmentation and image recognition. Existing algorithms based on the self-expressiveness of samples may suffer from an unsuitable balance between the rank and the sparsity of the representation matrix. In this paper, we propose a new model that balances rank and sparsity well. The model adopts the log-determinant function to control the rank of the solution. Meanwhile, the diagonal entries are penalized rather than strictly constrained to zero, which makes the rank–sparsity balance more tunable. Furthermore, we give a new graph construction from the low-rank and sparse solution, which absorbs the advantages of the graph constructions used in sparse subspace clustering and low-rank representation. Numerical experiments show that the new method, named RSBR, significantly increases the accuracy of subspace clustering on the real-world data sets we tested.
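The clustering step shared by self-expressive methods can be sketched as follows. This is a minimal illustration, not the authors' RSBR solver: it assumes the representation matrix C has already been computed by some optimization, and it reproduces only the symmetrized affinity |C| + |C^T| and the spectral embedding that sparse subspace clustering and low-rank representation both rely on.

```python
import numpy as np
from scipy.linalg import eigh

def affinity_from_representation(C):
    # Symmetrize the self-expressive coefficients: W = (|C| + |C^T|) / 2,
    # and zero the diagonal so no sample is its own neighbor.
    W = (np.abs(C) + np.abs(C.T)) / 2.0
    np.fill_diagonal(W, 0.0)
    return W

def spectral_embedding(W, k):
    # Normalized graph Laplacian: L = I - D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # The k eigenvectors with smallest eigenvalues span the
    # cluster-indicator subspace; k-means on their rows gives the clusters.
    _, vecs = eigh(L)
    return vecs[:, :k]
```

Rows of the embedding belonging to the same connected block of the affinity graph coincide, which is why graph connectivity (discussed in the Notes below) matters for clustering accuracy.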


[Figures 1–7 omitted; images not available in this version]


Notes

  1. We say \(\mathbf{C}_k\) is connected if the undirected graph with adjacency matrix \(|\mathbf{C}_k|+|\mathbf{C}_k^T|\) is connected.

  2. http://vision.jhu.edu/code/fetchcode.php?id=4.

  3. http://www.vision.jhu.edu/data/hopkins155.

  4. http://www.cs.columbia.edu/CAVE/software/softlib/coil-20.php.

  5. http://www.escience.cn/people/fpnie/papers.html.

  6. In this case, \(\mathbf{C}\) should be a block-diagonal matrix with K connected blocks, up to a permutation.
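The connectivity condition in Notes 1 and 6 can be checked programmatically. A minimal sketch (the function name is illustrative), building the undirected adjacency matrix \(|\mathbf{C}_k|+|\mathbf{C}_k^T|\) and counting its connected components:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def is_connected_block(Ck):
    # Undirected adjacency matrix |C_k| + |C_k^T| from Note 1
    A = np.abs(Ck) + np.abs(Ck.T)
    # The block is connected iff the graph has exactly one component
    n_comp, _ = connected_components(csr_matrix(A), directed=False)
    return n_comp == 1
```

Under a suitable permutation, a representation matrix satisfying Note 6 passes this check on each of its K diagonal blocks.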


Acknowledgements

The work was supported in part by NSFC Projects 11571312 and 91730303, and National Basic Research Program of China (973 Program) 2015CB352503.

Author information

Corresponding author: Zhenyue Zhang.


Cite this article

Xia, Y., Zhang, Z. Rank–sparsity balanced representation for subspace clustering. Machine Vision and Applications 29, 979–990 (2018). https://doi.org/10.1007/s00138-018-0918-y
