
Sparse Recovery Algorithms: Sufficient Conditions in Terms of Restricted Isometry Constants

  • Simon Foucart
Conference paper
Part of the Springer Proceedings in Mathematics book series (PROM, volume 13)

Abstract

We review three recovery algorithms used in Compressive Sensing for the reconstruction of s-sparse vectors x ∈ ℝ^N from the mere knowledge of linear measurements y = Ax ∈ ℝ^m, m < N. For each of the algorithms, we derive improved conditions on the restricted isometry constants of the measurement matrix A that guarantee the success of the reconstruction. These conditions are δ_{2s} < 0.4652 for basis pursuit, δ_{3s} < 0.5 and δ_{2s} < 0.25 for iterative hard thresholding, and δ_{4s} < 0.3843 for compressive sampling matching pursuit. The arguments also apply to almost sparse vectors and corrupted measurements. The analysis of iterative hard thresholding is surprisingly simple. The analysis of basis pursuit features a new inequality that encompasses several inequalities encountered in Compressive Sensing.
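To illustrate the simplest of the three algorithms, the iterative hard thresholding iteration x ← H_s(x + A*(y − Ax)), where H_s keeps the s largest-magnitude entries, can be sketched in NumPy as follows. This is a minimal sketch, not the author's implementation; the step size, iteration count, and problem sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np

def iterative_hard_thresholding(A, y, s, n_iters=200, mu=1.0):
    """Sketch of IHT: repeat x <- H_s(x + mu * A^T (y - A x)),
    where H_s zeroes all but the s largest-magnitude entries."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = x + mu * A.T @ (y - A @ x)       # gradient step on ||y - Ax||^2 / 2
        smallest = np.argsort(np.abs(x))[:-s]  # indices of the N - s smallest entries
        x[smallest] = 0.0                      # hard-thresholding operator H_s
    return x

# Illustrative problem sizes (assumptions for this sketch)
rng = np.random.default_rng(0)
m, N, s = 150, 200, 3
A = rng.standard_normal((m, N)) / np.sqrt(m)  # Gaussian matrices satisfy the RIP with high probability
x_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=s)
y = A @ x_true
x_rec = iterative_hard_thresholding(A, y, s)
```

In the noiseless sparse regime sketched here, the iterates stay s-sparse by construction, and under the restricted isometry conditions of the paper the iteration contracts toward x_true.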

Keywords

Compressive Sensing · Convex Polytope · Basis Pursuit · Recovery Algorithm · Restricted Isometry Property



Acknowledgements

The author thanks the meeting organizers, Mike Neamtu and Larry Schumaker, for welcoming a minisymposium on Compressive Sensing at the Approximation Theory conference. He also acknowledges support from the French National Research Agency (ANR) through project ECHANGE (ANR-08-EMER-006).


Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

Department of Mathematics, Drexel University, Philadelphia, USA
