
A Modified Newton Projection Method for \(\ell _1\)-Regularized Least Squares Image Deblurring


Abstract

In recent years, \(\ell _1\)-regularized least squares have become a popular approach to image deblurring because of the edge-preserving property of the \(\ell _1\)-norm. In this paper, we consider the nonnegatively constrained quadratic program reformulation of the \(\ell _1\)-regularized least squares problem and propose to solve it with an efficient modified Newton projection method that requires only matrix–vector operations. This approach favors nonnegative solutions without explicitly imposing any constraints on the \(\ell _1\)-regularized least squares problem. Experimental results on image deblurring test problems indicate that the proposed approach performs well in comparison with state-of-the-art methods.
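
The abstract does not state the reformulation explicitly, so the following is a sketch only: a standard way to turn the \(\ell _1\)-regularized least squares problem into a nonnegatively constrained quadratic program, assuming the usual notation with blurring operator \(A\), observed image \(b\), and regularization parameter \(\lambda\), is to split the unknown into its positive and negative parts, \(x = u - v\) with \(u, v \ge 0\):

\[
\min_{x}\;\tfrac{1}{2}\|Ax-b\|_2^2+\lambda\|x\|_1
\;\;\longrightarrow\;\;
\min_{z\ge 0}\;\tfrac{1}{2}z^{\top}Bz+c^{\top}z,
\qquad
z=\begin{pmatrix}u\\ v\end{pmatrix},\quad
B=\begin{pmatrix}A^{\top}A & -A^{\top}A\\ -A^{\top}A & A^{\top}A\end{pmatrix},\quad
c=\lambda\mathbf{1}+\begin{pmatrix}-A^{\top}b\\ A^{\top}b\end{pmatrix}.
\]

Products with \(B\) reduce to products with \(A\) and \(A^{\top}\), which is what allows a projected Newton-type iteration on the quadratic program to be carried out with matrix–vector operations alone; the paper itself should be consulted for the exact reformulation used and for the modifications made to the Newton projection step.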




Acknowledgments

The author is grateful to the two anonymous referees for many helpful and perceptive suggestions.

Author information

Corresponding author: G. Landi.


About this article


Cite this article

Landi, G. A Modified Newton Projection Method for \(\ell _1\)-Regularized Least Squares Image Deblurring. J Math Imaging Vis 51, 195–208 (2015). https://doi.org/10.1007/s10851-014-0514-3
