Journal of Mathematical Imaging and Vision, Volume 51, Issue 3, pp. 361–377

Robust Principal Component Pursuit via Inexact Alternating Minimization on Matrix Manifolds

Abstract

Robust principal component pursuit (RPCP) refers to the decomposition of a data matrix into a low-rank component and a sparse component. In this work, instead of invoking a convex-relaxation model based on the nuclear norm and the \(\ell^1\)-norm, as is typically done in this context, RPCP is solved by considering a least-squares problem subject to rank and cardinality constraints. An inexact alternating minimization scheme, with guaranteed global convergence, is employed to solve the resulting constrained minimization problem. In particular, the low-rank matrix subproblem is resolved inexactly by a tailored Riemannian optimization technique, which favorably avoids singular value decompositions in full dimension. For the overall method, a corresponding \(q\)-linear convergence theory is established. Numerical experiments show that the proposed method is competitive with a popular convex-relaxation-based approach.
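The constrained model described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's method: the low-rank step here uses a plain truncated SVD (the Eckart–Young best rank-\(r\) approximation), whereas the paper replaces exactly this step with a Riemannian optimization technique to avoid full-dimensional SVDs; the sparse step is hard thresholding to the \(k\) largest-magnitude residual entries. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def rpcp_altmin(M, rank, card, iters=50):
    """Illustrative sketch of RPCP by alternating minimization:
        minimize ||M - L - S||_F^2  s.t.  rank(L) <= rank, ||S||_0 <= card.
    NOTE: the low-rank step below uses a truncated SVD as a stand-in;
    the paper's contribution is to solve this subproblem inexactly by
    Riemannian optimization, avoiding SVDs in full dimension."""
    S = np.zeros_like(M)
    for _ in range(iters):
        # Low-rank step: best rank-r approximation of M - S (Eckart-Young).
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse step: keep only the `card` largest-magnitude entries
        # of the residual M - L (projection onto the cardinality set).
        R = M - L
        thresh = np.partition(np.abs(R), -card, axis=None)[-card]
        S = np.where(np.abs(R) >= thresh, R, 0.0)
    return L, S
```

On data that is exactly a low-rank matrix plus a few large outliers, this alternating scheme typically separates the two components; the paper's analysis establishes q-linear convergence for its inexact variant of the same two-block structure.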

Keywords

Matrix decomposition · Low-rank matrix · Sparse matrix · Image processing · Alternating minimization · Riemannian manifold · Optimization on manifolds

Mathematics Subject Classification

15A83 · 53B21 · 65K10 · 90C30 · 94A08

Acknowledgments

This research was supported by the Austrian Science Fund (FWF) through START project Y305 “Interfaces and Free Boundaries” and through SFB project F3204 “Mathematical Optimization and Applications in Biomedical Sciences”.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Department of Mathematics, Humboldt-Universität zu Berlin, Berlin, Germany
  2. Institute for Mathematics and Scientific Computing, Karl-Franzens-University of Graz, Graz, Austria
