A High-Order Tensor Completion Algorithm Based on Fully-Connected Tensor Network Weighted Optimization

  • Conference paper
  • First Online:
Pattern Recognition and Computer Vision (PRCV 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13534)

Abstract

Tensor completion aims to recover missing entries in data and is a topic of wide interest in deep learning and signal processing. Among higher-order tensor decomposition algorithms, the recently proposed fully-connected tensor network (FCTN) decomposition is the most advanced. In this paper, leveraging the superior expressive power of the FCTN decomposition, we propose a new tensor completion method named fully-connected tensor network weighted optimization (FCTN-WOPT). The algorithm composes the completed tensor from factors initialized by the FCTN decomposition. We build a loss function that combines the weight tensor, the completed tensor and the incomplete tensor, and then update the completed tensor with the limited-memory BFGS (L-BFGS) algorithm, which reduces memory consumption and speeds up iterations. Finally, we test the completion on synthetic data and real data (both images and video), and the results show the advanced performance of FCTN-WOPT when applied to higher-order tensor completion.
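The objective described above can be summarized as a weighted least-squares fit: given a weight (observation-mask) tensor W, the observed tensor X and FCTN factors {G_k}, minimize f({G_k}) = 0.5 * || W o (FCTN({G_k}) - X) ||_F^2 with L-BFGS. The code below is a minimal, illustrative sketch of this idea for a third-order tensor, not the authors' implementation: the contraction follows the standard order-3 FCTN layout, and the helper names (fctn_3d, unpack, loss_and_grad), the rank r = 3, the random initialization and the 30% sampling rate are hypothetical choices for a toy example.

# Minimal sketch (assumed, not the paper's code): weighted FCTN completion of a
# 3rd-order tensor, optimized with SciPy's L-BFGS-B.
import numpy as np
from scipy.optimize import minimize

def fctn_3d(G1, G2, G3):
    # Full order-3 FCTN contraction:
    # T[i,j,k] = sum_{a,b,c} G1[i,a,b] * G2[a,j,c] * G3[b,c,k]
    return np.einsum('iab,ajc,bck->ijk', G1, G2, G3)

def unpack(x, shapes):
    # Split a flat parameter vector back into the three factor tensors.
    splits = np.cumsum([np.prod(s) for s in shapes])[:-1]
    return [seg.reshape(s) for seg, s in zip(np.split(x, splits), shapes)]

def loss_and_grad(x, shapes, W, X):
    G1, G2, G3 = unpack(x, shapes)
    R = W * (fctn_3d(G1, G2, G3) - X)        # weighted residual
    f = 0.5 * np.sum(R ** 2)                 # 0.5 * ||W o (FCTN(G) - X)||_F^2
    D = W * R                                # gradient w.r.t. the reconstruction
    # Gradient w.r.t. each factor: contract D with the other two factors.
    dG1 = np.einsum('ijk,ajc,bck->iab', D, G2, G3)
    dG2 = np.einsum('ijk,iab,bck->ajc', D, G1, G3)
    dG3 = np.einsum('ijk,iab,ajc->bck', D, G1, G2)
    return f, np.concatenate([g.ravel() for g in (dG1, dG2, dG3)])

# Toy problem: a 10x10x10 tensor with 30% of its entries observed.
rng = np.random.default_rng(0)
I, r = 10, 3
shapes = [(I, r, r), (r, I, r), (r, r, I)]
T_true = fctn_3d(*[rng.standard_normal(s) for s in shapes])
W = (rng.random(T_true.shape) < 0.3).astype(float)   # 1 = observed, 0 = missing
X = W * T_true

x0 = 0.1 * rng.standard_normal(int(sum(np.prod(s) for s in shapes)))
res = minimize(loss_and_grad, x0, args=(shapes, W, X),
               jac=True, method='L-BFGS-B')

T_hat = fctn_3d(*unpack(res.x, shapes))
err = np.linalg.norm((1 - W) * (T_hat - T_true)) / np.linalg.norm((1 - W) * T_true)
print(f'relative error on the missing entries: {err:.3f}')

Note that in the paper the factors are initialized from an FCTN decomposition of the observed tensor rather than at random; random initialization is used here only to keep the sketch self-contained.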


Notes

  1. The data is available at http://openremotesensing.net/kb/data/.

  2. The data is available at http://trace.eas.asu.edu/yuv/.

  3. Homepage: http://gtl.inrialpes.fr/.


Author information

Corresponding author

Correspondence to Yonghui Huang.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Yang, P., Huang, Y., Qiu, Y., Sun, W., Zhou, G. (2022). A High-Order Tensor Completion Algorithm Based on Fully-Connected Tensor Network Weighted Optimization. In: Yu, S., et al. Pattern Recognition and Computer Vision. PRCV 2022. Lecture Notes in Computer Science, vol 13534. Springer, Cham. https://doi.org/10.1007/978-3-031-18907-4_32

  • DOI: https://doi.org/10.1007/978-3-031-18907-4_32

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-18906-7

  • Online ISBN: 978-3-031-18907-4

  • eBook Packages: Computer Science, Computer Science (R0)
