
Efficient Tensor Low-Rank Representation with a Closed Form Solution

  • Conference paper
Pattern Recognition (ACPR 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14407)


Abstract

In recent years, many tensor data processing methods have emerged. Tensor low-rank representation (TLRR) is a recently proposed tensor-based clustering method with promising clustering performance. However, its computational efficiency is low because its optimization procedure is iterative and must compute the tensor singular value decomposition (t-SVD) and the tensor product (t-product) in each iteration. To address this problem, we propose an efficient TLRR with a closed form solution (ETLRR/CFS); that is, no iterative procedure is needed, and the solution to ETLRR/CFS is obtained in a single step, which greatly improves computational efficiency. Specifically, we propose a novel objective function that integrates the tensor nuclear norm (TNN) and the Frobenius norm into a unified framework, and we derive its closed form solution. Experimental results on several datasets show that ETLRR/CFS is not only much faster than TLRR and its improved variants but also achieves comparable clustering performance.
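
For readers unfamiliar with the tensor operations named above, the sketch below shows, in NumPy, the FFT-based t-product and the tensor nuclear norm computed from the Fourier-domain frontal slices, following the standard t-SVD framework of Kilmer and Martin and a common TNN convention. It is an illustrative sketch only, not the authors' ETLRR/CFS implementation, and the paper's closed form solution is not reproduced here.

```python
import numpy as np

def t_product(A, B):
    """t-product of A (n1 x n2 x n3) with B (n2 x n4 x n3): FFT along the
    third mode, slice-wise matrix products, then inverse FFT."""
    n1, _, n3 = A.shape
    n4 = B.shape[1]
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.empty((n1, n4, n3), dtype=complex)
    for k in range(n3):
        Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
    return np.real(np.fft.ifft(Cf, axis=2))

def tensor_nuclear_norm(A):
    """Tensor nuclear norm (TNN): the sum of the matrix nuclear norms of the
    Fourier-domain frontal slices, divided by n3 (one common convention)."""
    Af = np.fft.fft(A, axis=2)
    n3 = A.shape[2]
    return sum(np.linalg.norm(Af[:, :, k], 'nuc') for k in range(n3)) / n3

# Toy usage: a product of thin tensors has low tubal rank and hence a small TNN.
rng = np.random.default_rng(0)
X = t_product(rng.standard_normal((30, 5, 10)), rng.standard_normal((5, 30, 10)))
print(tensor_nuclear_norm(X))
```

Each evaluation of the TNN already costs one FFT along the third mode plus a singular value decomposition per frequency slice; an iterative TLRR solver pays this cost (and that of several t-products) in every iteration, which is the overhead the one-step ETLRR/CFS solution avoids.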


Acknowledgments

This research was supported by NSFC (No. 61976005) and the Natural Science Research Project of Anhui Province University (No. 2022AH050970).

Author information

Corresponding author

Correspondence to Gui-Fu Lu.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Kan, Y., Lu, GF., Du, Y., Ji, G. (2023). Efficient Tensor Low-Rank Representation with a Closed Form Solution. In: Lu, H., Blumenstein, M., Cho, SB., Liu, CL., Yagi, Y., Kamiya, T. (eds) Pattern Recognition. ACPR 2023. Lecture Notes in Computer Science, vol 14407. Springer, Cham. https://doi.org/10.1007/978-3-031-47637-2_25

  • DOI: https://doi.org/10.1007/978-3-031-47637-2_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-47636-5

  • Online ISBN: 978-3-031-47637-2

  • eBook Packages: Computer Science, Computer Science (R0)
