
A descent spectral Dai–Liao method based on the quasi–Newton aspects

  • Original Paper
  • Published:
Numerical Algorithms

Abstract

A family of spectral Dai–Liao methods is proposed in which the Dai–Liao parameter is obtained by making the search direction matrix approach the BFGS (Broyden–Fletcher–Goldfarb–Shanno) updating formula, following the analysis of Babaie–Kafaki and Ghanbari (4OR 15(1), 85–92, 2017). In addition, the spectral parameter is computed so as to guarantee the sufficient descent condition for general functions. Convergence analyses are established for both convex and nonconvex optimization problems. Finally, numerical tests illustrate the efficiency of the proposed method.
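The direction update behind this family can be sketched concretely. A spectral Dai–Liao method computes d_{k+1} = −θ_k g_{k+1} + β_k d_k, where β_k = g_{k+1}ᵀ(y_k − t s_k)/(d_kᵀ y_k) is the classical Dai–Liao parameter. The sketch below uses illustrative defaults (θ_k = 1, t = 1), not the paper's BFGS-based choice of t or its descent-guaranteeing spectral parameter:

```python
import numpy as np

def dai_liao_direction(g_new, d, s, y, t=1.0, theta=1.0):
    """Spectral Dai-Liao update: d_{k+1} = -theta*g_{k+1} + beta_k*d_k,
    with beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)  (Dai-Liao, 2001).
    theta=1 reduces to the plain Dai-Liao direction (illustrative only;
    the paper derives specific formulas for t and theta)."""
    denom = d @ y
    beta = g_new @ (y - t * s) / denom if abs(denom) > 1e-16 else 0.0
    return -theta * g_new + beta * d

# Demo: minimize f(x) = 0.5 x^T A x for SPD A, with exact line search
# (for a quadratic the exact step along d is alpha = -g.d / d.A.d).
A = np.diag([1.0, 4.0, 9.0])
x = np.array([1.0, 1.0, 1.0])
g = A @ x
d = -g
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:
        break
    alpha = -(g @ d) / (d @ A @ d)  # exact minimizer along d
    s = alpha * d                   # s_k = x_{k+1} - x_k
    x = x + s
    g_new = A @ x                   # gradient of the quadratic
    d = dai_liao_direction(g_new, d, s, g_new - g)
    g = g_new
```

With exact line search on a quadratic, g_{k+1}ᵀs_k = 0, so β_k reduces to the Hestenes–Stiefel parameter and the iteration terminates in at most n steps; the paper's inexact-line-search setting is where the spectral parameter and the choice of t matter.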


Data Availability

The authors confirm that the data supporting the findings of this study are available within the manuscript. Raw data that support the findings of this study are available from the corresponding author upon reasonable request.

References

  1. Aminifard, Z., Babaie–Kafaki, S.: Matrix analyses on the Dai–Liao conjugate gradient method. ANZIAM J. 61(2), 195–203 (2019)

  2. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  3. Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)

  4. Andrei, N.: Nonlinear conjugate gradient methods for unconstrained optimization. Springer, Berlin (2020)

  5. Aminifard, Z., Babaie–Kafaki, S.: An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix. 4OR 17(3), 317–330 (2019)

  6. Babaie–Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)

  7. Babaie–Kafaki, S., Ghanbari, R.: Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11), 2277–2287 (2015)

  8. Babaie–Kafaki, S., Ghanbari, R.: A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR 15(1), 85–92 (2017)

  9. Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)

  10. Dai, Z.F.: Two modified HS type conjugate gradient methods for unconstrained optimization problems. Nonlinear Anal. Theory Methods Appl. 74(3), 927–936 (2011)

  11. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  12. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)

  13. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  14. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)

  15. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)

  16. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  17. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

  18. Khoshsimaye–Bargard, M., Ashrafi, A.: A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update. Comput. Appl. Math. 40(8), 1–17 (2021)

  19. Khoshsimaye–Bargard, M., Ashrafi, A.: A descent family of three–term conjugate gradient methods with global convergence for general functions. Pac. J. Optim. (accepted)

  20. Li, X., Shi, J., Dong, X., Yu, J.: A new conjugate gradient method based on Quasi–Newton equation for unconstrained optimization. J. Comput. Appl. Math. 350, 372–379 (2019)

  21. Liu, J.K., Feng, Y.M., Zou, L.M.: A spectral conjugate gradient method for solving large–scale unconstrained optimization. Comput. Math. Appl. 77(3), 731–739 (2019)

  22. Liu, J.K., Zhao, Y.X., Wu, X.L.: Some three–term conjugate gradient methods with the new direction structure. Appl. Numer. Math. 150, 433–443 (2020)

  23. Lu, J., Yuan, G., Wang, Z.: A modified Dai–Liao conjugate gradient method for solving unconstrained optimization and image restoration problems. J. Appl. Math. Comput. 68(2), 681–703 (2022)

  24. Nocedal, J., Wright, S.J.: Numerical optimization. Springer, New York (2006)

  25. Oren, S.S., Luenberger, D.G.: Self–scaling variable metric (SSVM) algorithms: Part I: criteria and sufficient conditions for scaling a class of algorithms. Manag. Sci. 20(5), 845–862 (1974)

  26. Oren, S.S., Spedicato, E.: Optimal conditioning of self–scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)

  27. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)

  28. Powell, M.J.D.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)

  29. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three–term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)

  30. Sun, W., Yuan, Y.X.: Optimization theory and methods: nonlinear programming. Springer, New York (2006)


Acknowledgements

This research was supported by the Research Council of Semnan University. The authors thank the anonymous reviewer for his/her valuable comments and suggestions that helped to improve the quality of this work. They are grateful to Professor Michael Navon for providing the line search code.

Author information


Contributions

The authors confirm their contributions to the manuscript as follows: study conception and design: A. Ashrafi; convergence analysis: M. Khoshsimaye–Bargard; numerical tests and interpretation of results: M. Khoshsimaye–Bargard; draft manuscript preparation: M. Khoshsimaye–Bargard, A. Ashrafi. All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Ali Ashrafi.

Ethics declarations

Consent for publication

The participants have consented to the submission of the manuscript to the Numerical Algorithms journal.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Khoshsimaye–Bargard, M., Ashrafi, A. A descent spectral Dai–Liao method based on the quasi–Newton aspects. Numer Algor 94, 397–411 (2023). https://doi.org/10.1007/s11075-023-01506-z

