Two effective hybrid conjugate gradient algorithms based on modified BFGS updates

Abstract

Based on two modified secant equations proposed by Yuan and by Li and Fukushima, we extend the approach proposed by Andrei and introduce two hybrid conjugate gradient methods for unconstrained optimization problems. Our methods are hybridizations of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods. Under proper conditions, we show that one of the proposed algorithms is globally convergent for uniformly convex functions and the other is globally convergent for general functions. To enhance the performance of the line search procedure, we propose a new approach for computing the initial steplength with which the line search is initiated. We compare implementations of our algorithms with two efficient representative hybrid conjugate gradient methods proposed by Andrei, using unconstrained optimization test problems from the CUTEr collection. Numerical results show that, in the sense of the performance profile introduced by Dolan and Moré, the proposed hybrid algorithms are competitive, and in some cases more efficient.
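The hybridization described above can be illustrated schematically. The sketch below is an assumption for illustration only, not the authors' actual update: it computes the classical Hestenes-Stiefel and Dai-Yuan conjugate gradient parameters and blends them with a fixed weight `theta`, whereas the paper derives its hybridization from modified secant equations.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta=0.5):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan
    conjugate gradient parameters (illustrative sketch only).

    g_new, g_old : gradients at the new and previous iterates
    d_old        : previous search direction
    theta        : hybridization weight in [0, 1]; an assumption
                   here, not the weight derived in the paper
    """
    y = g_new - g_old              # gradient difference y_k
    denom = d_old @ y              # shared denominator d_k^T y_k
    beta_hs = (g_new @ y) / denom  # Hestenes-Stiefel parameter
    beta_dy = (g_new @ g_new) / denom  # Dai-Yuan parameter
    return theta * beta_hs + (1.0 - theta) * beta_dy

# New search direction: d_{k+1} = -g_{k+1} + beta * d_k
g_old = np.array([1.0, -2.0])
g_new = np.array([0.5, 1.0])
d_old = -g_old                     # steepest-descent start
beta = hybrid_beta(g_new, g_old, d_old)
d_new = -g_new + beta * d_old
```

In a full method, `d_new` would feed a line search (e.g. under the Wolfe conditions) to produce the next iterate; the convergence results stated in the abstract concern how such hybrid parameters behave under those conditions.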

References

  1. Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5, 121–124 (1985)
  2. Andrei, N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16, 333–352 (2007)
  3. Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algor. 47, 143–156 (2008)
  4. Andrei, N.: Hybrid conjugate gradient algorithm for unconstrained optimization. J. Optim. Theory Appl. 141, 249–264 (2009)
  5. Andrei, N.: Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. Numer. Algor. 54(1), 23–46 (2010)
  6. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)
  7. Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001)
  8. Dai, Y.H.: New properties of a nonlinear conjugate gradient method. Numer. Math. 89, 83–98 (2001)
  9. Dai, Y.H., Han, J.Y., Liu, G.H., Sun, D.F., Yin, H.X., Yuan, Y.X.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10(2), 348–358 (1999)
  10. Dai, Y.H., Yuan, Y.X.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103, 33–47 (2001)
  11. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
  12. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  13. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  14. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)
  15. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr, a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)
  16. Guo, Q., Liu, J.G., Wang, D.H.: A modified BFGS method and its superlinear convergence in nonconvex minimization with general line search rule. J. Appl. Math. Comput. 29, 435–446 (2008)
  17. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
  18. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(5), 409–436 (1952)
  19. Hu, Y.F., Storey, C.: Global convergence result for conjugate gradient methods. J. Optim. Theory Appl. 71, 399–405 (1991)
  20. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)
  21. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)
  22. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)
  23. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer (2006)
  24. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Inform. Rech. Opér. 16, 35–43 (1969)
  25. Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
  26. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
  27. Touati-Ahmed, D., Storey, C.: Efficient hybrid conjugate gradient techniques. J. Optim. Theory Appl. 64, 379–397 (1990)
  28. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer (2006)
  29. Xiao, Y., Wang, Q., Wang, D.: Notes on the Dai–Yuan–Yuan modified spectral gradient method. J. Comput. Appl. Math. 234(10), 2986–2992 (2010)
  30. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11, 325–332 (1991)
  31. Yuan, Y.X., Byrd, R.H.: Non-quasi-Newton updates for unconstrained optimization. J. Comput. Math. 13(2), 95–107 (1995)
  32. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)
  33. Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
  34. Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)

Author information

Correspondence to Nezam Mahdavi-Amiri.

Cite this article

Babaie-Kafaki, S., Fatemi, M. & Mahdavi-Amiri, N. Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numer Algor 58, 315–331 (2011). https://doi.org/10.1007/s11075-011-9457-6

Keywords

  • Unconstrained optimization
  • Hybrid conjugate gradient algorithm
  • Modified BFGS method
  • Global convergence