Unsupervised learning low-rank tensor from incomplete and grossly corrupted data
Low-rank tensor completion and recovery have received considerable attention in the recent literature. The existing algorithms, however, are prone to fail when the multiway data are simultaneously contaminated by arbitrary outliers and missing values. In this paper, we study the unsupervised tensor learning problem, in which a low-rank tensor is recovered from an incomplete and grossly corrupted multidimensional array. We introduce a unified framework for this problem by replacing the linear projection operator constraint with a simple equation, and further reformulate it as two convex optimization problems through different approximations of the tensor rank. Two globally convergent algorithms, derived from the alternating direction augmented Lagrangian (ADAL) and linearized proximal ADAL methods, respectively, are proposed for solving these problems. Experimental results on synthetic and real-world data validate the effectiveness and superiority of our methods.
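To make the described framework concrete, the following is a minimal sketch of an ADAL/ADMM-style solver for the single-mode (matrix) simplification of the model: a low-rank component plus a sparse outlier component are recovered from partial observations, with the projection-operator constraint replaced by an equation in which the unobserved entries are treated as free variables. All function names, the parameter choices (`lam`, the `mu` update rule), and the stopping rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Elementwise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_complete(T_obs, mask, lam=None, n_iter=100):
    """Sketch: min ||L||_* + lam*||S||_1  s.t.  L + S = X,
    where X equals T_obs on observed entries (mask == True) and its
    unobserved entries are free variables -- the "simple equation"
    replacing the linear projection operator constraint.
    Single-mode (matrix) simplification of the tensor model."""
    m, n = T_obs.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))        # common RPCA-style default
    X = np.where(mask, T_obs, 0.0)
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)                      # scaled dual variable
    mu = 1.25 / np.linalg.norm(X, 2)          # standard inexact-ALM choice
    for _ in range(n_iter):
        L = svt(X - S + Y / mu, 1.0 / mu)     # low-rank update
        S = soft(X - L + Y / mu, lam / mu)    # sparse-outlier update
        X = np.where(mask, T_obs, L + S)      # refill unobserved entries
        Y = Y + mu * (X - L - S)              # dual ascent
        mu = min(mu * 1.5, 1e7)               # increasing penalty
    return L, S
```

For a full tensor version, the nuclear-norm step would be applied to each mode-i unfolding with an auxiliary variable per mode, following the sum-of-nuclear-norms approximation of the tensor rank.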
Keywords: Unsupervised learning · Low-rank tensor · Tensor recovery · Convex optimization · Alternating direction augmented Lagrangian (ADAL)
This work was supported by the National Natural Science Foundation of China under Grants 61702023 and 91538204.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no competing interests.