HOOD: High-Order Orthogonal Decomposition for Tensors

  • Conference paper
Smart Computing and Communication (SmartCom 2020)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12608)

Abstract

Tensor decompositions are becoming increasingly important in processing images and videos. Previous methods, such as CANDECOMP/PARAFAC decomposition (CPD), Tucker decomposition (TKD), and tensor train decomposition (TTD), treat all modes (or coordinates) equally. Their results therefore lack a natural, hierarchical connection between a given tensor and its lower-order slices (e.g., between a video and its frames). To overcome this practical limitation of existing tensor decomposition methods, we propose an innovative High-Order Orthogonal Decomposition (HOOD) for tensors of arbitrary order. HOOD decomposes a given tensor using orthogonal linear combinations of its lower-order slices; each orthogonal linear combination is then decomposed further, until the given tensor is expressed as a sum of orthogonal rank-one tensors. For object detection and recognition tasks in high-resolution videos, HOOD demonstrates clear advantages: it is about 100 times faster than CPD with similar detection and recognition accuracy, and it achieves better accuracy than TKD with similar time overhead. HOOD also improves explainability, because the resulting eigenimages visually reveal the most important common properties of the videos and images, a feature that CPD, TKD, and TTD do not offer.
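
The abstract describes the procedure only at a high level, so the following is a minimal illustrative sketch in NumPy for a third-order tensor, under stated assumptions: the orthogonal combinations of slices are taken from an SVD of the slice-unfolded tensor, and each combination (an "eigenimage") is then itself decomposed by SVD into rank-one terms. The function name hood_3d, the tolerance eps, and the SVD-based construction are illustrative assumptions, not the authors' implementation.

    # Hedged sketch of the HOOD idea from the abstract; not the paper's code.
    import numpy as np

    def hood_3d(T, eps=1e-10):
        # Decompose a third-order tensor T (I x J x K) into mutually
        # orthogonal rank-one terms: T ~= sum_m sigma_m * a_m (x) b_m (x) c_m.
        I, J, K = T.shape
        # Mode-3 unfolding: column k is the vectorized frontal slice T[:, :, k].
        M = T.reshape(I * J, K)
        # SVD across slices: each left singular vector is an orthogonal linear
        # combination of the slices (an "eigenimage"); the right singular
        # vectors give the mixing weights over the original slices.
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        terms = []
        for r in range(len(s)):
            if s[r] < eps:
                break
            eigenimage = U[:, r].reshape(I, J)
            # Decompose each eigenimage further (again by SVD here) into
            # orthogonal rank-one matrices.
            Ue, se, Vte = np.linalg.svd(eigenimage, full_matrices=False)
            for q in range(len(se)):
                if se[q] < eps:
                    break
                terms.append((s[r] * se[q], Ue[:, q], Vte[q], Vt[r]))
        return terms

    # Usage: the terms reconstruct the tensor up to floating-point error.
    rng = np.random.default_rng(0)
    T = rng.standard_normal((8, 6, 5))
    terms = hood_3d(T)
    T_hat = sum(sig * np.einsum('i,j,k->ijk', a, b, c)
                for sig, a, b, c in terms)
    print(np.linalg.norm(T - T_hat))  # ~0 up to rounding

In this construction the slice-mixing vectors are orthonormal across eigenimages, and the factors within each eigenimage are orthonormal as well, so the rank-one terms are mutually orthogonal, matching the structure the abstract describes; the eigenimages are the visually interpretable components it mentions.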


References

  1. Baskaran, M., et al.: Memory-efficient parallel tensor decompositions. In: 2017 IEEE High Performance Extreme Computing Conference (HPEC), pp. 1–7. IEEE (2017)

  2. Carroll, J.D., Chang, J.J.: Analysis of individual differences in multidimensional scaling via an N-way generalization of “Eckart-Young” decomposition. Psychometrika 35(3), 283–319 (1970)

  3. Chen, M., Zhang, Y., Qiu, M., Guizani, N., Hao, Y.: SPHA: smart personal health advisor based on deep analytics. IEEE Commun. Mag. 56(3), 164–169 (2018)

  4. Cho, S., Jun, T.J., Kang, M.: Applying tensor decomposition to image for robustness against adversarial attack. arXiv preprint arXiv:2002.12913 (2020)

  5. Dai, W., Qiu, L., Wu, A., Qiu, M.: Cloud infrastructure resource allocation for big data applications. IEEE Trans. Big Data 4(3), 313–324 (2016)

  6. De Lathauwer, L., De Moor, B., Vandewalle, J.: A multilinear singular value decomposition. SIAM J. Matrix Anal. Appl. 21(4), 1253–1278 (2000)

  7. Drineas, P., Mahoney, M.W.: A randomized algorithm for a tensor-based generalization of the singular value decomposition. Linear Algebra Appl. 420(2–3), 553–571 (2007)

  8. Huang, H., Liu, X., Zhang, T., Yang, B.: Regression PCA for moving objects separation. In: 2020 IEEE Global Communications Conference (GLOBECOM 2020), accepted. IEEE (2020)

  9. Imaizumi, M., Hayashi, K.: Tensor decomposition with smoothness. In: Proceedings of the 34th International Conference on Machine Learning, vol. 70, pp. 1597–1606. JMLR.org (2017)

  10. Kolda, T.G.: Orthogonal tensor decompositions. SIAM J. Matrix Anal. Appl. 23(1), 243–255 (2001)

  11. Kolda, T.G., Bader, B.W.: Tensor decompositions and applications. SIAM Rev. 51(3), 455–500 (2009)

  12. Kolda, T.G., Sun, J.: Scalable tensor decompositions for multi-aspect data mining. In: Eighth IEEE International Conference on Data Mining, pp. 363–372 (2008)

  13. Liu, X., Huang, H., Tang, W., Zhang, T., Yang, B.: Low-rank sparse tensor approximations for large high-resolution videos. In: 19th IEEE International Conference on Machine Learning and Applications (ICMLA 2020), accepted. IEEE (2020)

  14. Malik, O.A., Becker, S.: Low-rank Tucker decomposition of large tensors using TensorSketch. In: Advances in Neural Information Processing Systems, pp. 10096–10106 (2018)

  15. Oseledets, I.V.: Tensor-train decomposition. SIAM J. Sci. Comput. 33(5), 2295–2317 (2011)

  16. Tang, X., Bi, X., Qu, A.: Individualized multilayer tensor learning with an application in imaging analysis. J. Am. Stat. Assoc., pp. 1–26 (2019)

  17. Tucker, L.R.: Some mathematical notes on three-mode factor analysis. Psychometrika 31(3), 279–311 (1966)

  18. Wang, Y., Tung, H.Y., Smola, A.J., Anandkumar, A.: Fast and guaranteed tensor decomposition via sketching. In: Advances in Neural Information Processing Systems, pp. 991–999 (2015)

  19. Zhang, T.: CP decomposition and weighted clique problem. Stat. Probab. Lett. 161, 108723 (2020)


Author information


Correspondence to Tonglin Zhang or Baijian Yang.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Tang, W., Liu, X., Huang, H., Tang, Z., Zhang, T., Yang, B. (2021). HOOD: High-Order Orthogonal Decomposition for Tensors. In: Qiu, M. (eds) Smart Computing and Communication. SmartCom 2020. Lecture Notes in Computer Science, vol. 12608. Springer, Cham. https://doi.org/10.1007/978-3-030-74717-6_10

  • DOI: https://doi.org/10.1007/978-3-030-74717-6_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-74716-9

  • Online ISBN: 978-3-030-74717-6

  • eBook Packages: Computer Science, Computer Science (R0)
