Abstract
Robust principal component analysis (RPCA) has attracted considerable attention in recent years. The standard model assumes sparse noise and characterizes the error term by the ℓ1-norm. In practice, however, sparse noise often exhibits a clustering effect, so a single ℓp-norm is not an appropriate model. In this paper, we propose a novel method based on sparse Bayesian learning principles and Markov random fields. By enforcing the low-rank constraint through a matrix factorization formulation and incorporating the contiguity prior as a sparsity constraint, the method proves effective for low-rank matrix recovery and contiguous-outlier detection. Experiments on both synthetic data and practical computer vision applications show that the proposed method is competitive with state-of-the-art methods.
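For context, the classical convex RPCA baseline that this line of work builds on decomposes an observed matrix M into a low-rank part L and a sparse part S. The sketch below is not the authors' Bayesian/MRF algorithm; it is a naive alternating proximal scheme for the relaxed objective min ‖L‖* + λ‖S‖1 + (μ/2)‖M − L − S‖²_F, with the common (but here assumed) choice λ = 1/√max(m, n).

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: proximal operator of tau * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_*.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt

def rpca_sketch(M, lam=None, mu=1.0, iters=100):
    # Alternating minimization of ||L||_* + lam*||S||_1 + (mu/2)*||M - L - S||_F^2.
    # lam defaults to the standard 1/sqrt(max dimension) heuristic.
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S, 1.0 / mu)          # low-rank update
        S = soft_threshold(M - L, lam / mu)  # sparse update
    return L, S
```

Because the ℓ1 penalty treats each entry of S independently, this baseline ignores the spatial contiguity of real outliers (e.g., a moving object occupying a connected image region); modeling that structure with a Markov random field prior is precisely the gap the paper addresses.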
This work was partially supported by the National Natural Science Foundation of China (Grant No. 61379014) and Tianjin Research Program of Application Foundation and Advanced Technology (15JCYBJC21700).
Huan, G., Li, Y. & Song, Z. A novel robust principal component analysis method for image and video processing. Appl Math 61, 197–214 (2016). https://doi.org/10.1007/s10492-016-0128-8
Keywords
- robust principal component analysis
- sparse Bayesian learning
- Markov random fields
- matrix factorization
- contiguity prior