
Sparse Recovery with Random Matrices

  • Simon Foucart
  • Holger Rauhut
Chapter
Part of the Applied and Numerical Harmonic Analysis book series (ANHA)

Abstract

In this chapter, the restricted isometry property, which guarantees the uniform recovery of sparse vectors via a variety of methods, is proved to hold with high probability for subgaussian random matrices provided the number of rows (i.e., of measurements) scales like the sparsity times a logarithmic factor. For Gaussian matrices, precise estimates of the required number of measurements (including optimal, or at least small, values of the constants) are given in the settings of both nonuniform and uniform recovery. In the latter case, this is done first via an estimate of the restricted isometry constants and then directly through the null space property. Finally, a close relation between the restricted isometry property and the Johnson–Lindenstrauss lemma is uncovered.
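
To make the scaling mentioned in the abstract concrete, the following is the standard quantitative formulation from the compressive sensing literature, not necessarily the chapter's exact statement; the constant C is left unspecified here (it depends only on the subgaussian parameters of the matrix entries), whereas the chapter derives explicit values. A matrix A in R^{m×N} has restricted isometry constant δ_s if

\[
(1-\delta_s)\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1+\delta_s)\,\|x\|_2^2
\qquad \text{for all } s\text{-sparse } x \in \mathbb{R}^N,
\]

and for a properly normalized subgaussian random matrix, δ_s ≤ δ holds with high probability provided

\[
m \;\ge\; C\,\delta^{-2}\, s \,\ln(eN/s).
\]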

Keywords

isotropic subgaussian vectors, subgaussian matrices, concentration inequality, restricted isometry property, universality, uniform recovery, ℓ1-minimization, null space property, nonuniform recovery, Gaussian width, Gordon's escape through the mesh, Johnson–Lindenstrauss lemma

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Simon Foucart (1)
  • Holger Rauhut (2)
  1. Department of Mathematics, Drexel University, Philadelphia, USA
  2. Lehrstuhl C für Mathematik (Analysis), RWTH Aachen University, Aachen, Germany
