Abstract
In this paper, by slightly modifying the search direction of the MPRP method, we propose a variant PRP conjugate gradient method that satisfies the sufficient descent condition independently of any line search. We also propose a general form of the conjugate gradient parameter \(\beta _k\), for which the corresponding method always generates a sufficient descent direction regardless of the line search employed. We establish the global convergence of our methods without assuming that the steplength is bounded away from zero. Numerical results illustrate that our methods efficiently solve the test problems and are therefore promising.
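To make the sufficient descent property concrete, the following is a minimal sketch of the three-term MPRP direction of Zhang, Zhou, and Li that the paper builds on, paired with a simple Armijo backtracking line search. The function and parameter names are illustrative, not the authors' implementation; the key identity is that the MPRP direction gives \(g_k^T d_k = -\|g_k\|^2\) by construction, independent of the line search.

```python
import numpy as np

def mprp_direction(g, g_prev, d_prev):
    """Three-term MPRP direction: d = -g + beta*d_prev - theta*y.

    The beta*g^T d_prev and theta*g^T y terms cancel exactly, so
    g^T d = -||g||^2 (sufficient descent) for any line search."""
    y = g - g_prev
    denom = np.dot(g_prev, g_prev)
    beta = np.dot(g, y) / denom        # PRP parameter
    theta = np.dot(g, d_prev) / denom
    return -g + beta * d_prev - theta * y

def minimize_mprp(f, grad, x0, tol=1e-6, max_iter=500):
    """Illustrative MPRP iteration with Armijo backtracking."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                             # first iteration: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: accept t with sufficient decrease
        t, fx, slope = 1.0, f(x), np.dot(g, d)   # slope = -||g||^2 < 0
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        d = mprp_direction(g_new, g, d)
        x, g = x_new, g_new
    return x
```

On a convex quadratic \(f(x) = \tfrac{1}{2}x^T A x\) this iteration drives the gradient to zero; the point of the construction is that the descent guarantee needs no Wolfe-type curvature condition, only that a steplength is chosen.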
Acknowledgments
We would like to thank Professors W. W. Hager and H. Zhang for providing their CG_DESCENT code for the numerical comparison. This work was supported by the Natural Science Foundation of China (11361001), the Doctor Foundation of Beifang University of Nationalities (2014XBZ09), and the Fundamental Research Funds for the Central Universities (K50513100007).
Cite this article
Dong, X.L., Liu, H., Xu, Y.L. et al. Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence. Optim Lett 9, 1421–1432 (2015). https://doi.org/10.1007/s11590-014-0836-5