Abstract
The hierarchical tensor format allows for the low-parametric representation of tensors even in high dimensions d. On the one hand, this format provides a robust framework for approximate arithmetic operations with tensors based on rank truncations, which can be exploited in iterative algorithms. On the other hand, it can be used for the direct approximation of high-dimensional data stemming, e.g., from the discretisation of multivariate functions. In this review, we discuss several strategies for the adaptive approximation of tensors in the hierarchical format by black-box-type techniques, including problems of tensor reconstruction and tensor completion.
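The rank truncations mentioned above are built on truncated singular value decompositions of tensor matricizations. The following is a minimal sketch (not the authors' implementation) of this building block: a tensor is reshaped into a matrix along one mode, truncated to a prescribed rank via the SVD, and reshaped back. The function name and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def truncate_matricization(tensor, mode, rank):
    """Rank-truncate the mode-`mode` matricization of `tensor` via SVD.

    This is the elementary step underlying hierarchical (HSVD-type)
    rank truncations: matricize, truncate the SVD, reshape back.
    """
    shape = tensor.shape
    # Move the chosen mode to the front and flatten the remaining modes.
    mat = np.moveaxis(tensor, mode, 0).reshape(shape[mode], -1)
    # Keep only the `rank` dominant singular directions.
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    approx = u[:, :rank] @ np.diag(s[:rank]) @ vt[:rank, :]
    # Undo the matricization: restore the original mode ordering.
    rest = tuple(s_ for i, s_ in enumerate(shape) if i != mode)
    return np.moveaxis(approx.reshape((shape[mode],) + rest), 0, mode)

# Example: an elementary (rank-1) tensor is reproduced exactly
# by a rank-1 truncation of any of its matricizations.
rng = np.random.default_rng(0)
a, b, c = rng.random(4), rng.random(5), rng.random(6)
t = np.einsum('i,j,k->ijk', a, b, c)
t1 = truncate_matricization(t, mode=1, rank=1)
print(np.allclose(t, t1))  # True
```

In the hierarchical format this step is applied not only to single modes but to the mode clusters of a dimension tree, which is what keeps the representation low-parametric in high dimensions d.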
© 2014 Springer International Publishing Switzerland
Cite this chapter
Ballani, J., Grasedyck, L., Kluge, M. (2014). A Review on Adaptive Low-Rank Approximation Techniques in the Hierarchical Tensor Format. In: Dahlke, S., et al. Extraction of Quantifiable Information from Complex Systems. Lecture Notes in Computational Science and Engineering, vol 102. Springer, Cham. https://doi.org/10.1007/978-3-319-08159-5_10
Print ISBN: 978-3-319-08158-8
Online ISBN: 978-3-319-08159-5