A Linear Hybridization of the Hestenes–Stiefel Method and the Memoryless BFGS Technique

Abstract

We suggest a linear combination of the search directions of the Hestenes–Stiefel method and the memoryless BFGS (Broyden–Fletcher–Goldfarb–Shanno) technique, which yields a one-parameter extension of the Hestenes–Stiefel method. Based on an eigenvalue analysis, we show that the method may ensure the descent property. The parameter of the method is determined by a least-squares scheme so that the search direction of the method tends to the search direction of the three-term conjugate gradient method proposed by Zhang et al., which satisfies the sufficient descent condition. We conduct a brief global convergence analysis for the proposed method under the Wolfe line search conditions. Comparative numerical experiments are carried out on a set of CUTEr test problems and the detailed results are reported; they show the practical efficiency of the proposed method.

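Although the full derivation appears only in the paper itself, the abstract names the two building blocks: the Hestenes–Stiefel direction and the memoryless BFGS direction, combined linearly through a single parameter. The sketch below illustrates one such one-parameter hybrid direction; the function name, the particular combination form (1 - lam)·d_HS + lam·d_MLBFGS, and a user-supplied value of lam are assumptions made here for illustration, since the paper determines its parameter via a least-squares scheme that pulls the direction toward the sufficient-descent three-term direction of Zhang et al. [29].

```python
import numpy as np

def hybrid_hs_mlbfgs_direction(g_new, d, s, y, lam):
    """One-parameter linear hybrid of the HS and memoryless BFGS directions (sketch).

    g_new : gradient at the new iterate x_{k+1}
    d     : previous search direction d_k
    s     : step s_k = x_{k+1} - x_k
    y     : gradient difference y_k = g_{k+1} - g_k
    lam   : hybridization parameter (hypothetical placeholder; the paper picks
            it by a least-squares scheme, not as a fixed user-supplied value)
    """
    dy = d @ y          # d_k^T y_k, positive under the Wolfe conditions
    sy = s @ y          # s_k^T y_k, also positive under the Wolfe conditions

    # Hestenes-Stiefel direction: -g + beta_HS * d, with beta_HS = (g^T y)/(d^T y)
    beta_hs = (g_new @ y) / dy
    d_hs = -g_new + beta_hs * d

    # Memoryless BFGS (Shanno) direction: -H g, where H is the BFGS update of the identity
    gs, gy = g_new @ s, g_new @ y
    d_mlbfgs = (-g_new
                + (gy / sy) * s
                + (gs / sy) * y
                - (1.0 + (y @ y) / sy) * (gs / sy) * s)

    # Linear (one-parameter) hybridization of the two directions
    return (1.0 - lam) * d_hs + lam * d_mlbfgs
```
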

References

  1. Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
  2. Andrei, N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16(4), 333–352 (2007)
  3. Andrei, N.: Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. Numer. Algorithms 54(1), 23–46 (2010)
  4. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
  5. Babaie-Kafaki, S.: On the sufficient descent property of the Shanno’s conjugate gradient method. Optim. Lett. 7(4), 831–837 (2013)
  6. Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
  7. Babaie-Kafaki, S.: A modified scaling parameter for the memoryless BFGS updating formula. Numer. Algorithms 72(2), 425–433 (2016)
  8. Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
  9. Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)
  10. Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30(4), 673–681 (2015)
  11. Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234(5), 1374–1386 (2010)
  12. Dai, Y.H.: New properties of a nonlinear conjugate gradient method. Numer. Math. 89(1), 83–98 (2001)
  13. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
  14. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
  15. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
  16. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
  17. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
  18. Gould, N., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), Article 15 (2016)
  19. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
  20. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
  21. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
  22. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
  23. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49(6), 409–436 (1952)
  24. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
  25. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
  26. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
  27. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
  28. Xu, C., Zhang, J.Z.: A survey of quasi-Newton equations and quasi-Newton methods for optimization. Ann. Oper. Res. 103(1–4), 213–234 (2001)
  29. Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)

Acknowledgements

This research was supported by the Research Councils of Semnan University and Ferdowsi University of Mashhad. The authors thank the anonymous reviewer, whose valuable comments helped to improve the presentation.

Author information

Corresponding author

Correspondence to Saman Babaie-Kafaki.

About this article

Cite this article

Babaie-Kafaki, S., Ghanbari, R. A Linear Hybridization of the Hestenes–Stiefel Method and the Memoryless BFGS Technique. Mediterr. J. Math. 15, 86 (2018). https://doi.org/10.1007/s00009-018-1132-x

Mathematics Subject Classification

  • 90C53
  • 65K05
  • 65F35

Keywords

  • Nonlinear programming
  • unconstrained optimization
  • conjugate gradient method
  • memoryless BFGS method
  • global convergence