Total Variation Restoration of Images Corrupted by Poisson Noise with Iterated Conditional Expectations

  • Rémy Abergel
  • Cécile Louchet
  • Lionel Moisan
  • Tieyong Zeng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9087)

Abstract

Interpreting the celebrated Rudin-Osher-Fatemi (ROF) model in a Bayesian framework has led to interesting new variants for Total Variation image denoising in the last decade. The Posterior Mean variant avoids the so-called staircasing artifact of the ROF model but is computationally very expensive. Another recent variant, called TV-ICE (for Iterated Conditional Expectation), delivers very similar images but uses a much faster fixed-point algorithm. In the present work, we consider the TV-ICE approach in the case of a Poisson noise model. We derive an explicit form of the recursion operator, and show linear convergence of the algorithm, as well as the absence of staircasing effect. We also provide a numerical algorithm that carefully handles precision and numerical overflow issues, and show experiments that illustrate the interest of this Poisson TV-ICE variant.
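As a rough illustration of the iterated conditional expectation principle described above (not of the paper's closed-form recursion operator, which relies on incomplete gamma functions and a careful treatment of precision and overflow), the following Python sketch approximates each pixelwise conditional posterior mean by brute-force quadrature. The function name poisson_tv_ice, the anisotropic 4-neighborhood TV prior, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def poisson_tv_ice(v, lam=0.3, n_iter=100, t_max=None, n_quad=400):
    """Illustrative Poisson TV-ICE iteration via naive numerical quadrature.

    Each pixel is replaced by the mean of its conditional posterior given its
    4 neighbors, with Poisson likelihood t^v e^{-t} and an anisotropic TV
    prior exp(-lam * sum |t - u_neighbor|).  The paper derives a closed form
    for this update; here the 1-D integrals are simply discretized on a grid,
    which is slower and less accurate but easy to read.
    """
    v = np.asarray(v, dtype=float)
    if t_max is None:
        t_max = v.max() + 10.0 * np.sqrt(v.max() + 1.0)      # generous upper bound
    t = np.linspace(1e-8, t_max, n_quad).reshape(-1, 1, 1)   # quadrature nodes

    u = v.copy()
    for _ in range(n_iter):
        # 4-neighborhood values (borders replicated)
        pad = np.pad(u, 1, mode='edge')
        nbrs = [pad[:-2, 1:-1], pad[2:, 1:-1], pad[1:-1, :-2], pad[1:-1, 2:]]

        # log of the unnormalized conditional density at every quadrature node
        log_p = v * np.log(t) - t
        for un in nbrs:
            log_p -= lam * np.abs(t - un)

        log_p -= log_p.max(axis=0, keepdims=True)             # avoid overflow
        p = np.exp(log_p)
        u = (t * p).sum(axis=0) / p.sum(axis=0)               # conditional mean
    return u
```

A typical call on a Poisson-corrupted test image would be `u = poisson_tv_ice(noisy, lam=0.5, n_iter=50)`; the successive iterates can then be monitored to observe the fixed-point behavior discussed in the abstract, keeping in mind that the paper's explicit recursion operator is the appropriate tool for accurate and fast computation.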

Keywords

Poisson noise removal · Image denoising · Total variation · Posterior mean · Marginal conditional mean · Staircasing effect · Fixed-point algorithm · Incomplete gamma function

References

  1. Rudin, L.I., Osher, S., Fatemi, E.: Nonlinear total variation based noise removal algorithms. Physica D 60(1), 259–268 (1992)
  2. Caselles, V., Chambolle, A., Novaga, M.: Total variation in imaging. In: Handbook of Mathematical Methods in Imaging, pp. 1016–1057. Springer, New York (2011)
  3. Chambolle, A., Caselles, V., Cremers, D., Novaga, M., Pock, T.: An introduction to total variation for image analysis. In: Theoretical Foundations and Numerical Methods for Sparse Recovery, vol. 9, pp. 263–340 (2010)
  4. Darbon, J., Sigelle, M.: Image restoration with discrete constrained total variation part I: Fast and exact optimization. J. Math. Imag. Vis. 26(3), 261–276 (2007)
  5. Chambolle, A., Pock, T.: A first-order primal-dual algorithm for convex problems with applications to imaging. J. Math. Imag. Vis. 40(1), 120–145 (2011)
  6. Buades, A., Coll, B., Morel, J.-M.: A review of image denoising algorithms, with a new one. Multiscale Model. Simul. 4(2), 490–530 (2005)
  7. Dabov, K., Foi, A., Katkovnik, V., Egiazarian, K.: Image denoising by sparse 3-D transform-domain collaborative filtering. IEEE Trans. Image Processing 16(8), 2080–2095 (2007)
  8. Louchet, C., Moisan, L.: Total variation denoising using posterior expectation. In: Proc. European Signal Processing Conference (2008)
  9. Louchet, C., Moisan, L.: Posterior expectation of the total variation model: properties and experiments. SIAM J. Imaging Sci. 6(4), 2640–2684 (2013)
  10. Louchet, C., Moisan, L.: Total variation denoising using iterated conditional expectation. In: Proc. European Signal Processing Conference (2014)
  11. Setzer, S., Steidl, G., Teuber, T.: Deblurring Poissonian images by split Bregman techniques. J. Vis. Comm. Image Representation 21(3), 193–199 (2010)
  12. Deledalle, C., Tupin, F., Denis, L.: Poisson NL means: unsupervised non-local means for Poisson noise. In: Proc. Int. Conf. Image Processing, pp. 801–804 (2010)
  13. Schmidt, K.D.: On the covariance of monotone functions of a random variable. Unpublished note, University of Dresden (2003)
  14. Olver, F.W.J., Lozier, D.W., Boisvert, R.F., Clark, C.W. (eds.): NIST Handbook of Mathematical Functions. Cambridge University Press, New York (2010)
  15. NIST Digital Library of Mathematical Functions (2014). http://dlmf.nist.gov/ (release 1.0.9 of August 29, 2014)
  16. Cuyt, A., Petersen, V.B., Verdonk, B., Waadeland, H., Jones, W.B.: Handbook of Continued Fractions for Special Functions. Springer, New York (2008)
  17. Jones, W.B., Thron, W.J.: Continued Fractions: Analytic Theory and Applications. Encyclopedia of Mathematics and its Applications, vol. 11. Addison-Wesley, Reading, MA (1980)
  18. Numerical Recipes: The Art of Scientific Computing, 2nd edn. Cambridge University Press (2007)
  19. Abramowitz, M., Stegun, I.A.: Handbook of Mathematical Functions: with Formulas, Graphs, and Mathematical Tables. Dover Publications, New York (1972)
  20. Csiszár, I.: Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. Ann. Stat. 19, 2032–2066 (1991)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Rémy Abergel (1)
  • Cécile Louchet (2)
  • Lionel Moisan (1)
  • Tieyong Zeng (3)

  1. MAP5 (CNRS UMR 8145), Université Paris Descartes, Paris, France
  2. MAPMO (CNRS UMR 6628), Université d’Orléans, Orléans, France
  3. Department of Mathematics, Hong Kong Baptist University, Kowloon Tong, Hong Kong
