
The bounds of restricted isometry constants for low rank matrices recovery

Science China Mathematics

Abstract

This paper discusses conditions under which the solution of a linear system with minimal Schatten-p norm, 0 < p ⩽ 1, is also the lowest-rank solution of that system. An important tool for studying this problem is the restricted isometry constant (RIC). Several papers have provided upper bounds on the RIC under which nuclear-norm minimization stably recovers a low-rank matrix. For example, Fazel improved the upper bounds to $\delta^A_{4r} < 0.558$ and $\delta^A_{3r} < 0.4721$. More recently, the upper bound was improved to $\delta^A_{2r} < 0.307$. In fact, by using certain methods, these bounds can be further improved to $\delta^A_{2r} < 0.4931$ and $\delta^A_{r} < 0.309$. In this paper, we focus on lower bounds for the RIC: we show that there exist linear maps A with $\delta^A_{2r} > 1/\sqrt{2}$ or $\delta^A_{r} > 1/3$ for which nuclear-norm minimization fails to recover some matrices of rank at most r. These results indicate that there is only limited room left for improving the upper bounds on $\delta^A_{2r}$ and $\delta^A_{r}$. Furthermore, we also discuss upper bounds on the restricted isometry constants associated with a linear map A for the Schatten-p (0 < p < 1) quasi-norm minimization problem.
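
To make the recovery problem concrete, the following is a minimal numerical sketch (not taken from the paper): given linear measurements b = A(X0) of an unknown low-rank matrix X0, nuclear-norm minimization searches for the matrix of smallest nuclear norm consistent with those measurements. The dimensions, the random Gaussian measurement map, and the use of the cvxpy modeling package are illustrative assumptions only.

import numpy as np
import cvxpy as cp  # convex modeling package, assumed available

rng = np.random.default_rng(0)
n, r, m = 10, 1, 60  # matrix size, true rank, number of measurements (illustrative)

# Ground-truth rank-r matrix X0 and random measurement matrices A_1, ..., A_m
X0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((m, n, n)) / np.sqrt(m)
b = np.einsum("kij,ij->k", A, X0)  # b_k = <A_k, X0>

# Nuclear-norm minimization: minimize ||X||_* subject to A(X) = b
X = cp.Variable((n, n))
constraints = [cp.trace(A[k].T @ X) == b[k] for k in range(m)]
cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()

rel_err = np.linalg.norm(X.value - X0) / np.linalg.norm(X0)
print(f"relative recovery error: {rel_err:.2e}")

When the RIC of the measurement map falls below upper bounds of the type quoted above, this convex program recovers X0 exactly in the noiseless setting; the lower bounds established in the paper limit how far such thresholds can be raised.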


References

  1. Argyriou A, Micchelli C, Pontil M. Convex multi-task feature learning. Machine Learning, 2008, 73: 243–272

  2. Beck C, D'Andrea R. Computational study and comparisons of LFT reducibility methods. In: Proceedings of the American Control Conference. Michigan: American Automatic Control Council, 1998, 1013–1017

  3. Cai J F, Candès E J, Shen Z W. A singular value thresholding algorithm for matrix completion. SIAM J Optim, 2010, 20: 1956–1982

  4. Cai T T, Wang L, Xu G W. Shifting inequality and recovery of sparse signals. IEEE Trans Signal Process, 2010, 58: 1300–1308

  5. Cai T T, Wang L, Xu G W. New bounds for restricted isometry constants. IEEE Trans Inform Theory, 2010, 56: 4388–4394

  6. Candès E J. Compressive sampling. In: Proceedings of the International Congress of Mathematicians, vol. 3. Madrid: European Mathematical Society Publishing House, 2006, 1433–1452

  7. Candès E J. The restricted isometry property and its implications for compressed sensing. C R Acad Sci Paris Ser I, 2008, 346: 589–592

  8. Candès E J, Plan Y. Tight oracle bounds for low-rank recovery from a minimal number of random measurements. IEEE Trans Inform Theory, 2009, 57: 2342–2359

  9. Candès E J, Recht B. Exact matrix completion via convex optimization. Found Comput Math, 2008, 9: 717–772

  10. Candès E J, Tao T. Decoding by linear programming. IEEE Trans Inform Theory, 2005, 51: 4203–4215

  11. Candès E J, Tao T. The power of convex relaxation: Near-optimal matrix completion. IEEE Trans Inform Theory, 2009, 56: 2053–2080

  12. Chartrand R, Staneva V. Restricted isometry properties and nonconvex compressive sensing. Inverse Problems, 2008, 24: 1–14

  13. Davies M E, Gribonval R. Restricted isometry constants where ℓ_p sparse recovery can fail for 0 < p ⩽ 1. IEEE Trans Inform Theory, 2009, 55: 2203–2214

  14. Donoho D. Compressed sensing. IEEE Trans Inform Theory, 2006, 52: 1289–1306

  15. Fazel M, Hindi H, Boyd S. A rank minimization heuristic with application to minimum order system approximation. In: Proceedings of the American Control Conference, vol. 6. Washington: IEEE, 2001, 4734–4739

  16. Foucart S, Lai M J. Sparsest solutions of underdetermined linear systems via ℓ_q-minimization for 0 < q ⩽ 1. Appl Comput Harmon Anal, 2009, 26: 395–407

  17. Horn R A, Johnson C R. Topics in Matrix Analysis. New York: Cambridge University Press, 1991

  18. Lin Z C, Chen M M, Wu L Q. The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices. Technical Report UILU-ENG-09-2215, UIUC, 2009

  19. Ma S, Goldfarb D, Chen L. Fixed point and Bregman iterative methods for matrix rank minimization. Math Program, 2011, 128: 321–353

  20. Meka R, Jain P, Dhillon I S. Guaranteed rank minimization via singular value projection. Adv Neural Information Processing Systems, 2011, 23: 937–945

  21. Mirsky L. A trace inequality of John von Neumann. Monatsh Math, 1975, 79: 303–306

  22. Mo Q, Li S. New bounds on the restricted isometry constant δ_{2k}. Appl Comput Harmon Anal, 2011, 31: 460–468

  23. Mohan K, Fazel M. New restricted isometry results for noisy low-rank recovery. In: IEEE International Symposium on Information Theory Proceedings. Seattle: IEEE, 2010, 1573–1577

  24. Oymak S, Hassibi B. New null space results and recovery thresholds for matrix rank minimization. Math Program, in press

  25. Oymak S, Mohan K, Fazel M, et al. A simplified approach to recovery conditions for low rank matrices. In: IEEE International Symposium on Information Theory Proceedings. St. Petersburg: IEEE, 2011, 2318–2322

  26. Recht B, Fazel M, Parrilo P. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization. SIAM Rev, 2010, 52: 471–501

  27. Rennie J D, Srebro N. Fast maximum margin matrix factorization for collaborative prediction. In: Proceedings of the International Conference on Machine Learning. New York: Association for Computing Machinery, 2005, 713–719

  28. Shen Y, Li S. Restricted p-isometry property and its application for nonconvex compressive sensing. Adv Comput Math, 2012, 37: 441–452

  29. Weinberger K Q, Saul L K. Unsupervised learning of image manifolds by semidefinite programming. Int J Computer Vision, 2006, 70: 77–90

  30. Yuan M, Ekici A, Lu Z, et al. Dimension reduction and coefficient estimation in multivariate linear regression. J Roy Statist Soc Ser B, 2007, 69: 329–346

Author information

Corresponding author

Correspondence to Song Li.


Cite this article

Wang, H., Li, S. The bounds of restricted isometry constants for low rank matrices recovery. Sci. China Math. 56, 1117–1127 (2013). https://doi.org/10.1007/s11425-013-4624-y

