
A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems

  • Original Paper
  • Journal: Numerical Algorithms

Abstract

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because of their attractive practical features: simple computation, low memory requirements, and strong global convergence properties. Based on the sufficient descent property and the Dai-Liao conjugacy condition, a class of new three-term descent conjugate gradient algorithms is proposed. The proposed algorithms automatically possess the sufficient descent property and satisfy the conjugacy condition. Under the standard Wolfe line search and some common assumptions, global convergence of the proposed algorithms is established for uniformly convex functions and for general nonlinear functions. Numerical results indicate that the proposed algorithms are more efficient and reliable than the compared methods on the test problems. Finally, the proposed methods are applied to image restoration problems.
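For readers outside the conjugate gradient literature, the properties named in the abstract have standard forms. With \(g_k = \nabla f(x_k)\), \(s_k = x_{k+1} - x_k\), and \(y_k = g_{k+1} - g_k\), they read:

```latex
\begin{align*}
g_k^{\top} d_k &\le -c\,\|g_k\|^{2}, \quad c > 0
  && \text{(sufficient descent)} \\
d_{k+1}^{\top} y_k &= -t\, g_{k+1}^{\top} s_k, \quad t \ge 0
  && \text{(Dai-Liao conjugacy condition)} \\
f(x_k + \alpha_k d_k) &\le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k
  && \text{(Wolfe: sufficient decrease)} \\
\nabla f(x_k + \alpha_k d_k)^{\top} d_k &\ge \sigma\, g_k^{\top} d_k, \quad 0 < \delta < \sigma < 1
  && \text{(Wolfe: curvature)}
\end{align*}
```

As a concrete illustration of a three-term descent iteration \(d_{k+1} = -g_{k+1} + \beta_k d_k - \theta_k y_k\), the sketch below uses the classical coefficients \(\beta_k = g_{k+1}^{\top} y_k / \|g_k\|^2\) and \(\theta_k = g_{k+1}^{\top} d_k / \|g_k\|^2\), for which a direct calculation gives \(g_{k+1}^{\top} d_{k+1} = -\|g_{k+1}\|^2\), so sufficient descent holds automatically. These coefficients are illustrative placeholders, not the authors' proposed formulas (which are not reproduced here); the function name `three_term_cg` and the test problem are our choices, and only SciPy's `line_search` (a standard Wolfe-type search) is an existing API.

```python
# Sketch of a generic three-term descent CG method; coefficients are
# illustrative, NOT the paper's specific formulas. Requires NumPy and SciPy.
import numpy as np
from scipy.optimize import line_search

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Minimize f with a three-term CG direction and a standard Wolfe search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first step: steepest descent
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) <= tol:
            break
        # SciPy's line_search enforces the Wolfe conditions (c1 = delta, c2 = sigma).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.9)[0]
        if alpha is None:                     # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Three-term update d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k; with
        # these beta/theta one checks g_{k+1}^T d_{k+1} = -||g_{k+1}||^2.
        beta = (g_new @ y) / gnorm2
        theta = (g_new @ d) / gnorm2
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Usage example on the standard extended Rosenbrock test function in R^100.
if __name__ == "__main__":
    n = 100
    f = lambda x: np.sum(100.0 * (x[1::2] - x[::2] ** 2) ** 2 + (1 - x[::2]) ** 2)
    def grad(x):
        g = np.zeros_like(x)
        t = x[1::2] - x[::2] ** 2
        g[1::2] = 200.0 * t
        g[::2] = -400.0 * t * x[::2] - 2.0 * (1 - x[::2])
        return g
    x_star = three_term_cg(f, grad, np.full(n, -1.2))
    print(np.linalg.norm(grad(x_star)))       # should be roughly <= tol
```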



Acknowledgements

The authors are grateful to the editors and anonymous referees for their valuable comments and suggestions, which improved the quality of this paper.

Funding

This work is supported by the Science Foundation of Zhejiang Sci-Tech University (ZSTU) under Grant No. 21062347-Y, the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education of China (Grant No. [2019]52), the Guangxi Natural Science Key Foundation (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003), and the Special Foundation for Guangxi Ba Gui Scholars. This work is also supported by the National Natural Science Foundation of China (No. 11801503) and the Zhejiang Provincial Natural Science Foundation of China (No. LY20A010025).

Author information


Corresponding author

Correspondence to Xiaoliang Wang.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Data availability

The datasets generated during the current study are available from the corresponding author on reasonable request.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wang, X., Yuan, G. & Pang, L. A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems. Numer Algor 93, 949–970 (2023). https://doi.org/10.1007/s11075-022-01448-y
