Abstract
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because of their attractive practical features: simple computation, low memory requirements, and strong global convergence properties. Based on the sufficient descent property and the Dai-Liao conjugacy condition, a class of new three-term descent conjugate gradient algorithms is proposed. The proposed algorithms automatically possess the sufficient descent property and satisfy the conjugacy condition. Under the standard Wolfe line search and some common assumptions, the global convergence of the proposed algorithms is established for uniformly convex functions and for general nonlinear functions. Numerical results indicate that the proposed algorithms are more efficient and reliable than the compared methods on the test problems. Finally, the proposed methods are applied to some image restoration problems.
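To illustrate the structure of a three-term descent conjugate gradient iteration of the kind the abstract describes, the following is a minimal sketch. It implements the classical Zhang-Zhou-Li three-term direction with a weak-Wolfe line search, not the new family proposed in this paper; the test problem, dimensions, and tolerances are illustrative assumptions.

```python
# Sketch of a three-term descent CG method with a weak-Wolfe line search.
# Uses the classical Zhang-Zhou-Li three-term direction, NOT the paper's
# new family; the quadratic test problem below is an illustrative choice.
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Bisection search for a step satisfying the (weak) Wolfe conditions."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    f0, slope0 = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > f0 + c1 * alpha * slope0:   # Armijo fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope0:       # curvature fails
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            break
    return alpha

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=500):
    x = x0.copy()
    g = grad(x)
    d = -g                                  # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Three-term update d+ = -g+ + beta*d + theta*y with
        # beta = g+^T y / ||g||^2 and theta = -g+^T d / ||g||^2,
        # so g+^T d+ = -||g+||^2 holds identically (sufficient descent).
        gg = g @ g
        beta = (g_new @ y) / gg
        theta = -(g_new @ d) / gg
        d = -g_new + beta * d + theta * y
        x, g = x_new, g_new
    return x

# Uniformly convex quadratic test problem f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                 # symmetric positive definite
b = rng.standard_normal(n)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = three_term_cg(f, grad, np.zeros(n))
print(np.linalg.norm(A @ x_star - b))       # residual norm near zero
```

Note how the sufficient descent property is built into the direction update algebraically rather than enforced by the line search, which is the common design point of the three-term family.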
Acknowledgements
The authors are grateful to the editors and anonymous referees for their valuable comments and suggestions, which improved the quality of this paper.
Funding
This work is supported by the Science Foundation of Zhejiang Sci-Tech University (ZSTU) under Grant No. 21062347-Y, the National Natural Science Foundation of China (Grant No. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi institutions of higher education of China (Grant No. [2019]52), the Guangxi Natural Science Key Foundation (No. 2017GXNSFDA198046), the Special Funds for Local Science and Technology Development Guided by the Central Government (No. ZY20198003) and Special Foundation for Guangxi Ba Gui Scholars. This work is also supported by the National Natural Science Foundation of China (No. 11801503) and Zhejiang Provincial Natural Science Foundation of China (No. LY20A010025).
Ethics declarations
Conflict of interest
The authors declare no competing interests.
Data availability
The datasets generated during the current study are available from the corresponding author on reasonable request.
Cite this article
Wang, X., Yuan, G. & Pang, L. A class of new three-term descent conjugate gradient algorithms for large-scale unconstrained optimization and applications to image restoration problems. Numer Algor 93, 949–970 (2023). https://doi.org/10.1007/s11075-022-01448-y
Keywords
- Conjugate gradient
- Nonconvex functions
- Sufficient descent property
- Global convergence
- Gradient Lipschitz continuity condition