
A family of the modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent and conjugacy conditions

  • Original Research
  • Journal of Applied Mathematics and Computing

Abstract

We propose a modified version of the three-term Hestenes–Stiefel conjugate gradient method of Zhang et al. that strengthens the original scheme. Following the Dai–Liao approach, the third term of the Zhang et al. search direction is multiplied by a positive parameter that can be determined adaptively. To make an appropriate choice of this parameter, we carry out a matrix analysis that guarantees the sufficient descent property of the method. Convergence is then analyzed for both convex and nonconvex cost functions. Finally, numerical tests demonstrate the efficiency of the proposed method.
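As a minimal illustration of the structure being modified (a sketch, not the paper's actual algorithm), assume the three-term Hestenes–Stiefel direction of Zhang et al. [33], d_{k+1} = -g_{k+1} + (g_{k+1}^T y_k / d_k^T y_k) d_k - (g_{k+1}^T d_k / d_k^T y_k) y_k with y_k = g_{k+1} - g_k, and scale its third term by a positive parameter. The adaptive formula for that parameter is derived in the paper and is not reproduced here, so the sketch below takes it as a plain input t; the function name is hypothetical.

    import numpy as np

    def modified_tths_direction(g_new, g_old, d_old, t):
        """Three-term Hestenes-Stiefel direction with the third term scaled by t.

        t = 1 recovers the original Zhang et al. direction; the paper instead
        determines t adaptively via a Dai-Liao-style matrix analysis (that
        formula is not reproduced here).
        """
        y = g_new - g_old                   # gradient difference y_k
        dy = float(d_old @ y)               # denominator d_k^T y_k
        if abs(dy) < 1e-12:
            return -g_new                   # safeguard: restart with steepest descent
        beta_hs = float(g_new @ y) / dy     # Hestenes-Stiefel parameter
        theta = float(g_new @ d_old) / dy   # coefficient of the third term
        return -g_new + beta_hs * d_old - t * theta * y

Note that with t = 1 the direction satisfies d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 exactly, the descent property of the original Zhang et al. method; the point of the modification is to retain sufficient descent while also meeting a Dai–Liao-type conjugacy condition.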


References

  1. Amini, K., Faramarzi, P., Pirfalah, N.: A modified Hestenes–Stiefel conjugate gradient method with an optimal property. Optim. Methods Softw. 34(4), 770–782 (2019)

  2. Aminifard, Z., Babaie-Kafaki, S.: Dai–Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer. Algorithms 89(3), 1369–1387 (2021)

  3. Aminifard, Z., Babaie-Kafaki, S.: Matrix analyses on the Dai–Liao conjugate gradient method. ANZIAM J. 61(2), 195–203 (2019)

  4. Andrei, N.: An adaptive conjugate gradient algorithm for large-scale unconstrained optimization. J. Comput. Appl. Math. 292, 83–91 (2016)

  5. Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)

  6. Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)

  7. Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)

  8. Babaie-Kafaki, S., Ghanbari, R.: A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR 15(1), 85–92 (2017)

  9. Beale, E.M.L.: A derivation of conjugate gradients. In: Lootsma, F.A. (ed.) Numerical Methods for Nonlinear Optimization, pp. 39–43. Academic Press, London (1972)

  10. Bojari, S., Eslahchi, M.R.: Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer. Algorithms 83(3), 901–933 (2020)

  11. Bojari, S., Eslahchi, M.R.: A five-parameter class of derivative-free spectral conjugate gradient methods for systems of large-scale nonlinear monotone equations. Int. J. Comput. Methods (2022)

  12. Cao, J., Wu, J.: A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152, 243–252 (2020)

  13. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)

  14. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)

  15. Dong, X.L., Liu, H.W., He, Y.B., Yang, X.M.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281, 239–249 (2015)

  16. Eslahchi, M.R., Bojari, S.: Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization. Optim. Methods Softw. 37(3), 830–843 (2022)

  17. Exl, L., Fischbacher, J., Kovacs, A., Oezelt, H., Gusenbauer, M., Schrefl, T.: Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization. Comput. Phys. Commun. 235, 179–186 (2019)

  18. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  19. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)

  20. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  21. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  22. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

  23. Khoshsimaye-Bargard, M., Ashrafi, A.: A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update. Comput. Appl. Math. 40(8), 1–17 (2021)

  24. Li, L., Xie, X., Gao, T., Wang, J.: A modified conjugate gradient-based Elman neural network. Cogn. Syst. Res. 68, 62–72 (2021)

  25. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)

  26. Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)

  27. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)

  28. Powell, M.J.D.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)

  29. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)

  30. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)

  31. Xue, W., Wan, P., Li, Q., Zhong, P., Yu, G., Tao, T.: An online conjugate gradient algorithm for large-scale data analysis in machine learning. AIMS Math. 6(2), 1515–1537 (2021)

  32. Yao, S., Feng, Q., Li, L., Xu, J.: A class of globally convergent three-term Dai–Liao conjugate gradient methods. Appl. Numer. Math. 151, 354–366 (2020)

  33. Zhang, L., Zhou, W., Li, D.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)


Acknowledgements

This research was supported by the Research Council of Semnan University. The authors thank the anonymous reviewers for their valuable comments and suggestions, which helped to improve the quality of this work. They are grateful to Professor Michael Navon for providing the line search code.

Author information


Corresponding author

Correspondence to Ali Ashrafi.

Ethics declarations

Conflict of interest

There are no relevant financial or non-financial competing interests to report.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Khoshsimaye-Bargard, M., Ashrafi, A. A family of the modified three-term Hestenes–Stiefel conjugate gradient method with sufficient descent and conjugacy conditions. J. Appl. Math. Comput. 69, 2331–2360 (2023). https://doi.org/10.1007/s12190-023-01839-x


