The nonlinear conjugate gradient method (CGM) is very effective for solving large-scale optimization problems. In this paper, a modification of the Dai–Yuan (DY) nonlinear CGM is discussed, and a sufficient descent CGM for unconstrained optimization is proposed. Unlike the DY CGM, the presented CGM generates a sufficient descent direction at each iteration, independently of the line search. Under standard assumptions, the modified DY CGM with the Wolfe line search is proved to be globally convergent. Moreover, the idea is further extended to the Fletcher–Reeves (FR) CGM. Finally, extensive numerical experiments are reported, which show that the proposed methods are effective.
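The modification itself is developed in the full paper; for orientation, the classical DY scheme that it builds on can be sketched as follows. The update is β_k = ‖g_{k+1}‖² / (d_kᵀ(g_{k+1} − g_k)), with search direction d_{k+1} = −g_{k+1} + β_k d_k. The sketch below is a minimal illustration of this baseline method, not the authors' modified algorithm; it uses a simple Armijo backtracking line search in place of the full Wolfe conditions assumed by the convergence theory, and the test problem and function names are illustrative.

```python
import numpy as np

def dy_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Classical Dai-Yuan nonlinear CG (illustrative sketch, not the
    modified method of the paper)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: restart with steepest descent if d is not a
        # descent direction (cannot happen under the Wolfe conditions).
        if g.dot(d) >= 0:
            d = -g
        # Backtracking Armijo line search; the paper's analysis assumes
        # the (stronger) Wolfe conditions, but Armijo suffices here.
        alpha, c1 = 1.0, 1e-4
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + c1 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # DY update: beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)).
        denom = d.dot(g_new - g)
        beta = g_new.dot(g_new) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = dy_cg(f, grad, np.array([5.0, -3.0]))
```

On a quadratic with an Armijo step, the DY denominator d_kᵀ(g_{k+1} − g_k) = α d_kᵀA d_k stays positive, so the iteration remains well defined even without the exact Wolfe conditions.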
Keywords: Unconstrained optimization · Conjugate gradient method · Sufficient descent property · Global convergence
The author wishes to thank the reviewers for their constructive and pertinent suggestions for improving the presentation of this work. This work is supported by the National Natural Science Foundation of China (Grant No. 11271086), the Guangxi Natural Science Foundation of China (Grant No. 2011GXNSFD018002), and the Innovation Group of Talents Highland of Guangxi Higher Schools.