Abstract
As the amount of available data grows, managing information becomes increasingly difficult. In many applications data can be represented as a multidimensional array. However, in the big data setting, and also when we aim to discover structure in the data, we are often interested in constructing low-rank tensor approximations, for instance, using the tensor train (TT) decomposition. If the original data is nonnegative, we may wish to guarantee that an approximant preserves this property. Nonnegative tensor train factorization is a highly nontrivial task when we cannot afford to access every data element, which may be too expensive in the case of big data.
A natural solution is to build tensor trains whose carriages (cores) are all nonnegative. This means that the skeleton decompositions (approximations) have to be constructed nonnegative. Nonnegative factorizations can be used as models for recovering suitable structures in data, e.g., in machine learning and image processing tasks. In this work we suggest a new method for nonnegative tensor train factorizations, estimate its accuracy, and give numerical results for different problems.
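The key observation above, that a tensor train with elementwise-nonnegative cores always represents a nonnegative tensor, can be illustrated with a minimal sketch. This is not the authors' algorithm, only a demonstration of the property under assumed shapes and ranks; the helper `tt_to_full` and all sizes are chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n1, n2, n3 = 4, 5, 6      # mode sizes (arbitrary)
r1, r2 = 3, 2             # TT ranks (arbitrary)

# Nonnegative TT cores ("carriages"):
# G1 has shape (1, n1, r1), G2 has (r1, n2, r2), G3 has (r2, n3, 1).
G1 = rng.random((1, n1, r1))
G2 = rng.random((r1, n2, r2))
G3 = rng.random((r2, n3, 1))

def tt_to_full(cores):
    """Contract a list of TT cores into the full tensor."""
    full = cores[0]
    for core in cores[1:]:
        # Merge the trailing rank index of `full` with the
        # leading rank index of the next core.
        full = np.tensordot(full, core, axes=([full.ndim - 1], [0]))
    # Drop the boundary rank indices of size 1.
    return full.squeeze(axis=(0, -1))

T = tt_to_full([G1, G2, G3])

# Sums and products of nonnegative numbers are nonnegative,
# so nonnegative cores yield a nonnegative full tensor.
assert T.shape == (n1, n2, n3)
assert np.all(T >= 0)
```

Because the full tensor is a sum of products of core entries, nonnegativity of the cores is sufficient (though not necessary) for nonnegativity of the tensor, which is why the paper constructs the skeleton decompositions themselves to be nonnegative.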
The work was supported by the Russian Science Foundation, grant 19-11-00338.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Shcherbakova, E., Tyrtyshnikov, E. (2020). Nonnegative Tensor Train Factorizations and Some Applications. In: Lirkov, I., Margenov, S. (eds.) Large-Scale Scientific Computing. LSSC 2019. Lecture Notes in Computer Science, vol. 11958. Springer, Cham. https://doi.org/10.1007/978-3-030-41032-2_17
Print ISBN: 978-3-030-41031-5
Online ISBN: 978-3-030-41032-2