This paper employs a modified secant equation within the hybrid conjugate gradient (CG) framework based on Andrei's approach to solve large-scale unconstrained optimization problems. The CG parameter in this hybrid method is a convex combination of the CG parameters of the Hestenes–Stiefel and Dai–Yuan algorithms. The main feature of such hybrid methods is that the search direction is the Newton direction. The modified secant equation is derived from a fifth-order tensor model to better capture the curvature of the objective function. In addition, to achieve convergence for general functions, a revised version of the method is proposed, based on a linear combination of this secant equation and Li and Fukushima's modified secant equation. Under suitable conditions, the global convergence of the new hybrid CG algorithm is established, even without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the proposed hybrid conjugate gradient algorithm.
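The convex-combination idea described above can be illustrated with a minimal sketch. The code below is not the paper's algorithm (it omits the modified secant equation and the Newton-direction condition that determines the combination weight); it only shows, for a quadratic test function with exact line search, how a hybrid CG parameter is formed as a convex combination of the Hestenes–Stiefel and Dai–Yuan parameters. The function name and the fixed weight `theta` are illustrative assumptions.

```python
import numpy as np

def hybrid_cg_quadratic(A, b, x0, theta=0.5, tol=1e-8, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a hybrid HS/DY conjugate gradient direction.

    Illustrative sketch only: theta is held fixed, whereas in the hybrid
    methods discussed above the weight is chosen adaptively.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    d = -g                             # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)    # exact line search for a quadratic
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                  # gradient difference y_k
        denom = d @ y
        beta_hs = (g_new @ y) / denom          # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / denom      # Dai-Yuan parameter
        beta = (1 - theta) * beta_hs + theta * beta_dy  # convex combination
        d = -g_new + beta * d          # new search direction
        g = g_new
    return x
```

With exact line searches on a quadratic, the HS and DY parameters coincide, so any convex combination reproduces linear CG; the distinction between the two (and the choice of weight) matters for general nonlinear objectives and inexact line searches.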
Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7:149–154
Gould NIM, Orban D, Toint Ph L (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
Guo Q, Liu JG, Wang DH (2008) A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule. J Appl Math Comput 28:435–446