
Nonnegative Tensor Train Factorizations and Some Applications

  • Conference paper
  • In: Large-Scale Scientific Computing (LSSC 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11958)

Abstract

As the amount of available data grows, managing information becomes increasingly difficult. In many applications the data can be represented as a multidimensional array. In the big data setting, and also when we aim to discover structure in the data, we are often interested in constructing low-rank tensor approximations, for instance using the tensor train (TT) decomposition. If the original data are nonnegative, we may want to guarantee that the approximation preserves this property. Nonnegative tensor train factorization is a highly nontrivial task when we cannot afford to access every data element, as is the case with big data.
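For reference, the TT format mentioned above represents a $d$-dimensional array elementwise as a product of matrices; this is the standard definition, stated here for completeness since the page does not spell it out:

$$A(i_1, i_2, \ldots, i_d) = G_1(i_1)\, G_2(i_2) \cdots G_d(i_d),$$

where each carriage $G_k(i_k)$ is an $r_{k-1} \times r_k$ matrix, $r_0 = r_d = 1$, and $r_1, \ldots, r_{d-1}$ are the TT ranks. A nonnegative tensor train keeps every entry of every carriage nonnegative, so the reconstructed tensor is automatically nonnegative as well.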

A natural solution is to build tensor trains in which all carriages (cores) are nonnegative. This means that the underlying skeleton decompositions (approximations) must themselves be constructed to be nonnegative. Nonnegative factorizations can serve as models for recovering structure in data, e.g., in machine learning and image processing tasks. In this work we propose a new method for nonnegative tensor train factorization, estimate its accuracy, and give numerical results for several problems.
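To make the idea of nonnegative carriages concrete, below is a minimal Python sketch that builds a nonnegative tensor train by sequentially factorizing unfoldings with a basic multiplicative-update NMF, i.e., replacing the SVD step of the standard TT-SVD scheme by a nonnegative factorization. This is not the method proposed in the paper (which is designed to avoid accessing every data element); the names nmf and nonnegative_tt and all parameter choices are illustrative assumptions.

```python
import numpy as np

def nmf(M, r, iters=200, eps=1e-12):
    """Rank-r NMF of a nonnegative matrix M via Lee-Seung multiplicative updates."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        H *= (W.T @ M) / (W.T @ W @ H + eps)  # update H; stays nonnegative
        W *= (M @ H.T) / (W @ H @ H.T + eps)  # update W; stays nonnegative
    return W, H

def nonnegative_tt(A, ranks):
    """Split a nonnegative d-way array A into nonnegative TT cores.

    ranks: the d-1 internal TT ranks [r_1, ..., r_{d-1}].
    Returns cores G_k of shape (r_{k-1}, n_k, r_k), with r_0 = r_d = 1.
    """
    dims = A.shape
    d = len(dims)
    cores = []
    r_prev = 1
    C = A.reshape(dims[0], -1)                   # first unfolding (r_0 = 1)
    for k in range(d - 1):
        W, H = nmf(C, ranks[k])                  # nonnegative skeleton step
        cores.append(W.reshape(r_prev, dims[k], ranks[k]))
        r_prev = ranks[k]
        C = H.reshape(r_prev * dims[k + 1], -1)  # fold next mode into rows
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

# Usage: decompose a small random nonnegative 3-way tensor.
A = np.random.rand(4, 5, 6)
cores = nonnegative_tt(A, ranks=[3, 3])
print([G.shape for G in cores])  # [(1, 4, 3), (3, 5, 3), (3, 6, 1)]
```

Swapping NMF for the SVD keeps every factor, and hence every carriage, elementwise nonnegative, at the price of the SVD's optimality guarantees; the abstract indicates that the paper instead constructs the nonnegative skeleton approximations without reading every data element.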

The work was supported by the Russian Science Foundation, grant 19-11-00338.



Author information


Correspondence to Elena Shcherbakova.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Shcherbakova, E., Tyrtyshnikov, E. (2020). Nonnegative Tensor Train Factorizations and Some Applications. In: Lirkov, I., Margenov, S. (eds.) Large-Scale Scientific Computing. LSSC 2019. Lecture Notes in Computer Science, vol. 11958. Springer, Cham. https://doi.org/10.1007/978-3-030-41032-2_17


  • DOI: https://doi.org/10.1007/978-3-030-41032-2_17

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41031-5

  • Online ISBN: 978-3-030-41032-2

  • eBook Packages: Computer Science, Computer Science (R0)
