Missing Elements Recovery Using Low-Rank Tensor Completion and Total Variation Minimization
Low-rank (LR) and total variation (TV) regularization are two of the most popular priors in image processing and have sparked a large body of research, particularly on extending them from scalar to vector, matrix, or even higher-order settings. However, the discretization schemes commonly used for TV regularization often ignore differences in the intrinsic properties along each dimension, so they fail to fully exploit local smoothness and can also blur edges. To address this issue, we treat a color image as a third-order tensor and measure its smoothness with a TV norm along each dimension. The tensor is then recovered via Tucker decomposition. Specifically, we propose integrating the Shannon total variation (STV) into low-rank tensor completion (LRTC). Moreover, because the nuclear norm is a suboptimal surrogate for rank, we propose a new nonconvex low-rank constraint, the truncated \(\gamma \)-norm, for a closer rank approximation. We minimize the resulting cost function with the alternating direction method of multipliers (ADMM). Experiments on color image inpainting tasks demonstrate that the proposed method enhances the details of the recovered images.
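To make the building blocks of LRTC concrete, the following is a minimal Python sketch of two core operations, mode-n unfolding of a third-order tensor and singular value soft-thresholding (the proximal operator of the nuclear norm), combined in a naive completion loop. The function names (`unfold`, `fold`, `svt`, `lrtc_simple`), the averaging scheme, and all parameter values are illustrative assumptions only; they are not the paper's algorithm, which additionally incorporates STV regularization and the truncated \(\gamma \)-norm.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: flatten a 3rd-order tensor into a matrix
    whose rows index the chosen mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(matrix, mode, shape):
    """Inverse of unfold: rebuild the tensor from its mode-n unfolding."""
    full_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(matrix.reshape(full_shape), 0, mode)

def svt(matrix, tau):
    """Singular value soft-thresholding: the proximal operator
    of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(matrix, full_matrices=False)
    s = np.maximum(s - tau, 0.0)  # shrink singular values toward zero
    return (U * s) @ Vt

def lrtc_simple(observed, mask, tau=1.0, n_iters=50):
    """Naive LRTC loop (illustrative, not the paper's ADMM scheme):
    average the SVT of the three mode unfoldings, then re-impose
    the observed entries on every iteration."""
    X = observed.copy()
    for _ in range(n_iters):
        X = np.mean([fold(svt(unfold(X, m), tau), m, X.shape)
                     for m in range(3)], axis=0)
        X[mask] = observed[mask]  # keep known entries fixed
    return X
```

On a partially observed rank-1 tensor, this loop fills the missing entries by repeatedly pulling each unfolding toward low rank; the paper's method replaces the nuclear-norm proximal step with the truncated \(\gamma \)-norm and adds STV smoothness terms, solved jointly by ADMM.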
Keywords: Tensor completion · Low-rank · Shannon total variation
This research was funded by the Natural Science Foundation of China under Grant Nos. 61702275, 61976192, 61602413, and 41775008, and by the Zhejiang Provincial Natural Science Foundation of China under Grant Nos. LY18F020032 and LY19F030016.