Abstract
In this paper, an adaptive three-term conjugate gradient method is proposed for solving unconstrained optimization problems; the method generates a sufficient descent direction at each iteration. Unlike existing methods, the proposed method dynamically adjusts between the Hestenes–Stiefel and Dai–Liao conjugacy conditions. Under mild conditions, we show that the proposed method converges globally. Numerical experiments indicate that the new method solves the test problems efficiently and is therefore promising.
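To illustrate the general three-term conjugate gradient framework the abstract refers to (not the authors' specific adaptive method), the sketch below implements a classical three-term direction of Zhang–Zhou–Li type with a Hestenes–Stiefel coefficient. All names and the Armijo backtracking line search are illustrative assumptions; the direction update here is chosen because it satisfies the sufficient descent condition exactly.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG sketch (Zhang-Zhou-Li style, not the paper's method).

    Direction update:
        d_{k+1} = -g_{k+1} + beta * d_k - theta * y_k,
    with beta = g_{k+1}^T y_k / d_k^T y_k (Hestenes-Stiefel) and
    theta = g_{k+1}^T d_k / d_k^T y_k, which yields the sufficient
    descent identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (a stand-in for a Wolfe search).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                  # gradient difference y_k
        denom = d.dot(y)
        if abs(denom) < 1e-12:         # safeguard: restart with steepest descent
            d = -g_new
        else:
            beta = g_new.dot(y) / denom    # Hestenes-Stiefel coefficient
            theta = g_new.dot(d) / denom   # third-term coefficient
            d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic `f(x) = 0.5 x^T A x - b^T x`, the iteration converges to the solution of `A x = b`; the third term is what guarantees descent regardless of the line-search accuracy.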
Notes
It should be stressed that, for version 6.0 of the CG_DESCENT method only, the implementations were run on a Dell Precision T7500 with 96 GB of memory and dual six-core Intel Xeon processors (3.46 GHz), with only one core used. The numerical results for CG_DESCENT 6.0 are posted at http://users.clas.ufl.edu/hager/papers/CG/results6.0.txt.
Acknowledgements
We are grateful to the anonymous referees and the editor for their useful comments, which have made the paper clearer and more comprehensive than the earlier version. We thank Professors W. W. Hager and H. Zhang for providing their CG_DESCENT code for the numerical comparison.
Additional information
This work was supported by the First-Class Disciplines Foundation of Ningxia Hui Autonomous Region (No. NXYLXK2017B09), the National Natural Science Foundation of China (Nos. 11601012, 11861002, 71771030), the Key Project of North Minzu University (No. ZDZX201804), the Natural Science Foundation of Ningxia Hui Autonomous Region (Nos. NZ17103, 2018AAC03253), the Natural Science Foundation of Guangxi Zhuang Autonomous Region (No. 2018GXNSFAA138169), and the Guangxi Key Laboratory of Cryptography and Information Security (No. GCIS201708).
Cite this article
Dong, XL., Dai, ZF., Ghanbari, R. et al. An Adaptive Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition. J. Oper. Res. Soc. China 9, 411–425 (2021). https://doi.org/10.1007/s40305-019-00257-w
Keywords
- Three-term conjugate gradient method
- Sufficient descent condition
- Conjugacy condition
- Global convergence