Another hybrid conjugate gradient algorithm for unconstrained optimization

Abstract

Another hybrid conjugate gradient algorithm is proposed and analyzed. The parameter \( \beta_k \) is computed as a convex combination of the Hestenes-Stiefel and Dai-Yuan parameters \( \beta_k^{HS} \) and \( \beta_k^{DY} \), i.e. \( \beta_k^{C} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{DY} \). The parameter \( \theta_k \) in the convex combination is computed so that the direction of the conjugate gradient algorithm is the Newton direction and the pair \( (s_k, y_k) \) satisfies the quasi-Newton equation \( \nabla^2 f(x_{k+1})\, s_k = y_k \), where \( s_k = x_{k+1} - x_k \) and \( y_k = g_{k+1} - g_k \). The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
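As a reading aid, here is a minimal sketch in Python/NumPy of the hybrid step described above, assuming the usual definitions \( \beta_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k) \) and \( \beta_k^{DY} = \|g_{k+1}\|^2 / (d_k^T y_k) \). Multiplying the Newton condition \( \nabla^2 f(x_{k+1})\, d_{k+1} = -g_{k+1} \) by \( s_k^T \), substituting the quasi-Newton equation \( \nabla^2 f(x_{k+1})\, s_k = y_k \), and writing \( d_{k+1} = -g_{k+1} + \beta_k^{C} d_k \) gives \( y_k^T d_{k+1} = -g_{k+1}^T s_k \), from which \( \theta_k = g_{k+1}^T s_k / (g_{k+1}^T y_k - g_{k+1}^T g_{k+1}) \). This is one way to recover a \( \theta_k \) consistent with the abstract, not necessarily the paper's exact formula; the safeguards below (clipping \( \theta_k \) to \([0,1]\), a steepest-descent restart) and the function names are assumptions for illustration only. SciPy's line_search enforces the strong Wolfe conditions, which in particular imply the standard Wolfe conditions.

```python
# Illustrative sketch of a hybrid HS/DY conjugate gradient step; not the
# authors' reference implementation (theta_k safeguards and restart rule
# are assumptions).
import numpy as np
from scipy.optimize import line_search  # Wolfe line search


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Hybrid CG with beta_k^C = (1 - theta_k)*beta_HS + theta_k*beta_DY."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.9)
        if alpha is None:                         # line search failed: crude fallback step
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g               # s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k
        dy = d @ y
        if dy <= 0.0:                             # curvature safeguard; under Wolfe dy > 0
            d = -g_new
        else:
            beta_hs = (g_new @ y) / dy            # Hestenes-Stiefel
            beta_dy = (g_new @ g_new) / dy        # Dai-Yuan
            # theta_k from the Newton-direction / quasi-Newton condition (see lead-in),
            # clipped to [0, 1] so beta_k^C stays a convex combination.
            denom = g_new @ y - g_new @ g_new
            theta = float(np.clip((g_new @ s) / denom, 0.0, 1.0)) if denom != 0.0 else 0.0
            beta = (1.0 - theta) * beta_hs + theta * beta_dy
            d = -g_new + beta * d                 # new search direction
        x, g = x_new, g_new
    return x


# Example usage: minimize the 10-dimensional Rosenbrock function.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = hybrid_cg(rosen, rosen_der, np.zeros(10))
    print(x_star)
```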

References

  1. Andrei, N.: Test functions for unconstrained optimization. http://www.ici.ro/camo/neculai/SCALCG/evalfg.for

  2. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38, 401–416 (2007)

  3. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22, 561–571 (2007)

  4. Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20, 645–650 (2007)

  5. Andrei, N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16, 333–352 (2007)

  6. Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)

  7. Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)

  8. Dai, Y.H.: New properties of a nonlinear conjugate gradient method. Numer. Math. 89, 83–98 (2001)

  9. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  10. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, X., Yuan, Y.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10, 348–358 (1999)

  11. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)

  12. Dai, Y.H., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103, 33–47 (2001)

  13. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  14. Fletcher, R.: Practical Methods of Optimization, vol. 1: Unconstrained Optimization. John Wiley & Sons, New York (1987)

  15. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  16. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)

  17. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  18. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)

  19. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)

  20. Hu, Y.F., Storey, C.: Global convergence result for conjugate gradient methods. J. Optim. Theory Appl. 71, 399–405 (1991)

  21. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, Part 1: Theory. J. Optim. Theory Appl. 69, 129–137 (1991)

  22. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle, 3e Année 16, 35–43 (1969)

  23. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comp. Math. Math. Phys. 9, 94–112 (1969)

  24. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer-Verlag, Berlin (1984)

  25. Shanno, D.F., Phua, K.H.: Algorithm 500, Minimization of unconstrained multivariate functions. ACM Trans. Math. Softw. 2, 87–94 (1976)

  26. Touati-Ahmed, D., Storey, C.: Efficient hybrid conjugate gradient techniques. J. Optim. Theory Appl. 64, 379–397 (1990)

  27. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)

Author information

Corresponding author

Correspondence to Neculai Andrei.

About this article

Cite this article

Andrei, N. Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer Algor 47, 143–156 (2008). https://doi.org/10.1007/s11075-007-9152-9
