Journal of Mathematical Imaging and Vision, Volume 60, Issue 1, pp 128–144

An Extended Perona–Malik Model Based on Probabilistic Models

  • L. M. Mescheder
  • D. A. Lorenz


The Perona–Malik model has been very successful at restoring images from noisy input. In this paper, we reinterpret the Perona–Malik model in the language of Gaussian scale mixtures and derive some extensions of the model. Specifically, we show that the expectation–maximization (EM) algorithm applied to Gaussian scale mixtures leads to the lagged-diffusivity algorithm for computing stationary points of the Perona–Malik diffusion equations. Moreover, we show how mean field approximations to these Gaussian scale mixtures lead to a modification of the lagged-diffusivity algorithm that better captures the uncertainties in the restoration. Since this modification can be hard to compute in practice, we propose relaxations to the mean field objective to make the algorithm computationally feasible. Our numerical experiments show that this modified lagged-diffusivity algorithm often performs better at restoring textured areas and fuzzy edges than the unmodified algorithm. As a second application of the Gaussian scale mixture framework, we show how an efficient sampling procedure can be obtained for the probabilistic model, making the computation of the conditional mean and other expectations algorithmically feasible. Again, the resulting algorithm has a strong resemblance to the lagged-diffusivity algorithm. Finally, we show that a probabilistic version of the Mumford–Shah segmentation model can be obtained in the same framework with a discrete edge prior.
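The lagged-diffusivity fixed point that recurs throughout the abstract can be sketched compactly: freeze the diffusivity at the previous iterate and solve the resulting linear system. The following is a minimal 1-D illustration; the function name, parameter values, and the specific Perona–Malik diffusivity g(s) = 1/(1 + s²/λ²) are our choices for exposition, not the paper's exact setup.

```python
import numpy as np

def lagged_diffusivity_denoise(f, lam=0.1, alpha=1.0, n_iter=20):
    """Denoise a 1-D signal by lagged diffusivity (illustrative sketch).

    Approximately minimizes  0.5*||u - f||^2 + 0.5*alpha*sum_i g_i*(Du)_i^2
    with the Perona-Malik diffusivity g(s) = 1/(1 + s^2/lam^2): each
    iteration freezes g at the previous iterate and solves the resulting
    linear (tridiagonal) system exactly.
    """
    n = len(f)
    u = f.copy()
    for _ in range(n_iter):
        du = np.diff(u)                       # forward differences (Du)_i
        g = 1.0 / (1.0 + (du / lam) ** 2)     # lagged diffusivity
        # assemble A = I + alpha * D^T diag(g) D (dense here for clarity;
        # a sparse tridiagonal solver would be used in practice)
        A = np.eye(n)
        for i in range(n - 1):
            A[i, i] += alpha * g[i]
            A[i + 1, i + 1] += alpha * g[i]
            A[i, i + 1] -= alpha * g[i]
            A[i + 1, i] -= alpha * g[i]
        u = np.linalg.solve(A, f)
    return u
```

Because the diffusivity collapses across large gradients, jumps in the signal are smoothed far less than flat noisy regions, which is the edge-preserving behavior the model is known for.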


Keywords: Perona–Malik denoising · Probabilistic models · Mean field approximation · Gaussian scale mixtures



We would like to thank Sebastian Nowozin from Microsoft Research for helpful pointers to the literature.



Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Autonomous Vision Group, MPI Tübingen, Tübingen, Germany
  2. Institute for Analysis and Algebra, TU Braunschweig, Braunschweig, Germany
