Abstract
A family of spectral Dai–Liao methods is put forward in which the Dai–Liao parameter is determined by making the search direction matrix approach the BFGS (Broyden–Fletcher–Goldfarb–Shanno) updating formula, following the analysis conducted by Babaie–Kafaki and Ghanbari (4OR 15(1), 85–92, 2017). In addition, the spectral parameter is computed in such a way as to guarantee the sufficient descent condition for general functions. Convergence analyses are established for convex and nonconvex optimization problems. Finally, numerical tests demonstrate the efficiency of the proposed method.
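For readers who want a concrete picture of the framework the abstract describes, the following is a minimal Python sketch of one generic spectral Dai–Liao direction update. It is an illustration only, not the authors' method: the paper's specific choices of the spectral parameter and the Dai–Liao parameter are not reproduced here, so the function name, the default value of t, the choice theta = 1.0, and the safeguard thresholds are all assumptions made for the demonstration.

```python
import numpy as np

def spectral_dai_liao_direction(g_new, g_old, d_old, s, t=None, theta=1.0):
    """One generic spectral Dai-Liao direction update:
        d_new = -theta * g_new + beta * d_old,
    with beta = g_new.(y - t*s) / (d_old.y)   (Dai & Liao, 2001),
    where y = g_new - g_old and s = x_new - x_old.
    The paper derives particular theta and t; the values here
    (theta = 1, adaptive t = ||y||^2 / s.y) are placeholders.
    """
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < 1e-12:            # safeguard: restart with steepest descent
        return -g_new
    if t is None:
        t = (y @ y) / (s @ y)      # a common adaptive choice; an assumption here
    beta = g_new @ (y - t * s) / dy
    d_new = -theta * g_new + beta * d_old
    # fall back to steepest descent if the (hypothetical) sufficient
    # descent test g.d <= -c * ||g||^2 fails, with c = 1e-4
    if d_new @ g_new > -1e-4 * (g_new @ g_new):
        d_new = -g_new
    return d_new
```

Under a Wolfe line search one has d_old @ y > 0, which keeps the denominator safe for convex problems; the restart branch mirrors the standard safeguard used in descent conjugate gradient methods.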
Data Availability
The authors confirm that the data supporting the findings of this study are available within the manuscript. Raw data that support the findings of this study are available from the corresponding author upon reasonable request.
References
Aminifard, Z., Babaie–Kafaki, S.: Matrix analyses on the Dai–Liao conjugate gradient method. ANZIAM J. 61(2), 195–203 (2019)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)
Andrei, N.: Nonlinear conjugate gradient methods for unconstrained optimization. Springer, Berlin (2020)
Aminifard, Z., Babaie–Kafaki, S.: An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix. 4OR 17(3), 317–330 (2019)
Babaie–Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
Babaie–Kafaki, S., Ghanbari, R.: Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11), 2277–2287 (2015)
Babaie–Kafaki, S., Ghanbari, R.: A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR 15(1), 85–92 (2017)
Birgin, E.G., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)
Dai, Z.F.: Two modified HS type conjugate gradient methods for unconstrained optimization problems. Nonlinear Anal. Theory Methods Appl. 74(3), 927–936 (2011)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: A constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Khoshsimaye–Bargard, M., Ashrafi, A.: A new descent spectral Polak–Ribière–Polyak method based on the memoryless BFGS update. Comput. Appl. Math. 40(8), 1–17 (2021)
Khoshsimaye–Bargard, M., Ashrafi, A.: A descent family of three–term conjugate gradient methods with global convergence for general functions. Pac. J. Optim., (Accepted)
Li, X., Shi, J., Dong, X., Yu, J.: A new conjugate gradient method based on Quasi–Newton equation for unconstrained optimization. J. Comput. Appl. Math. 350, 372–379 (2019)
Liu, J.K., Feng, Y.M., Zou, L.M.: A spectral conjugate gradient method for solving large–scale unconstrained optimization. Comput. Math. Appl. 77(3), 731–739 (2019)
Liu, J.K., Zhao, Y.X., Wu, X.L.: Some three–term conjugate gradient methods with the new direction structure. Appl. Numer. Math. 150, 433–443 (2020)
Lu, J., Yuan, G., Wang, Z.: A modified Dai–Liao conjugate gradient method for solving unconstrained optimization and image restoration problems. J. Appl. Math. Comput. 68(2), 681–703 (2022)
Nocedal, J., Wright, S.J.: Numerical optimization. Springer, New York (2006)
Oren, S.S., Luenberger, D.G.: Self–scaling variable metric (SSVM) algorithms: Part I: criteria and sufficient conditions for scaling a class of algorithms. Manag. Sci. 20(5), 845–862 (1974)
Oren, S.S., Spedicato, E.: Optimal conditioning of self–scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)
Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983), volume 1066 of Lecture Notes in Math, pp 122–141. Springer, Berlin (1984)
Powell, M.J.D.: Convergence properties of algorithms for nonlinear optimization. SIAM Rev. 28(4), 487–500 (1986)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three–term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization theory and methods: nonlinear programming. Springer, New York (2006)
Acknowledgements
This research was supported by the Research Council of Semnan University. The authors thank the anonymous reviewer for his/her valuable comments and suggestions that helped to improve the quality of this work. They are grateful to Professor Michael Navon for providing the line search code.
Author information
Authors and Affiliations
Contributions
The authors confirm their contributions to the manuscript as follows: study conception and design: A. Ashrafi; convergence analysis: M. Khoshsimaye–Bargard; numerical tests and interpretation of results: M. Khoshsimaye–Bargard; draft manuscript preparation: M. Khoshsimaye–Bargard, A. Ashrafi. All authors reviewed the results and approved the final version of the manuscript.
Corresponding author
Ethics declarations
Consent for publication
The authors have consented to the submission of the manuscript to the Numerical Algorithms journal.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Khoshsimaye–Bargard, M., Ashrafi, A. A descent spectral Dai–Liao method based on the quasi–Newton aspects. Numer Algor 94, 397–411 (2023). https://doi.org/10.1007/s11075-023-01506-z
Received:
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11075-023-01506-z
Keywords
- Nonlinear programming
- Spectral conjugate gradient method
- Quasi–Newton update
- Sufficient descent property
- Global convergence