Abstract
We suggest a linear combination of the search directions of the Hestenes–Stiefel method and the memoryless BFGS (Broyden–Fletcher–Goldfarb–Shanno) technique, yielding a one-parameter extension of the Hestenes–Stiefel method. Based on an eigenvalue analysis, we show that the method may ensure the descent property. In a least-squares scheme, the parameter of the method is determined so that the search direction of the method approaches the search direction of the three-term conjugate gradient method proposed by Zhang et al., which satisfies the sufficient descent condition. We conduct a brief global convergence analysis for the proposed method under the Wolfe line search conditions. Comparative numerical experiments are carried out on a set of the CUTEr test problems and detailed results are reported; they demonstrate the practical efficiency of the proposed method.
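As a point of reference, the following minimal LaTeX sketch collects the standard formulas behind the abstract: the Hestenes–Stiefel and memoryless BFGS directions and the three-term direction of Zhang et al. The explicit hybrid form with parameter θ is only an illustrative assumption suggested by the phrase "linear combination"; it is not necessarily the paper's exact formulation.

% A minimal, self-contained LaTeX sketch of the standard directions
% referred to in the abstract. The explicit hybrid form d_{k+1}(theta)
% below is an illustrative assumption, not the paper's exact formula.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $g_k = \nabla f(x_k)$, $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$.
The Hestenes--Stiefel direction is
\[
d_{k+1}^{\mathrm{HS}} = -g_{k+1} + \beta_k^{\mathrm{HS}} d_k,
\qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},
\]
and the memoryless BFGS direction is
$d_{k+1}^{\mathrm{MLBFGS}} = -H_{k+1} g_{k+1}$, where $H_{k+1}$ is the
BFGS update of the identity matrix:
\[
H_{k+1} = I - \frac{s_k y_k^{T} + y_k s_k^{T}}{s_k^{T} y_k}
        + \Bigl(1 + \frac{y_k^{T} y_k}{s_k^{T} y_k}\Bigr)
          \frac{s_k s_k^{T}}{s_k^{T} y_k}.
\]
A one-parameter linear hybrid of the two directions may be written as
\[
d_{k+1}(\theta) = \theta\, d_{k+1}^{\mathrm{HS}}
                + (1 - \theta)\, d_{k+1}^{\mathrm{MLBFGS}},
\qquad \theta \in \mathbb{R}.
\]
The three-term direction of Zhang et al., toward which $\theta$ is tuned
in a least-squares sense, is
\[
d_{k+1}^{\mathrm{ZZL}} = -g_{k+1} + \beta_k^{\mathrm{HS}} d_k
  - \frac{g_{k+1}^{T} d_k}{d_k^{T} y_k}\, y_k,
\]
which satisfies the sufficient descent identity
$g_{k+1}^{T} d_{k+1}^{\mathrm{ZZL}} = -\|g_{k+1}\|^{2}$,
independently of the line search.
\end{document}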
References
Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
Andrei, N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16(4), 333–352 (2007)
Andrei, N.: Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization. Numer. Algorithms 54(1), 23–46 (2010)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Babaie-Kafaki, S.: On the sufficient descent property of the Shanno’s conjugate gradient method. Optim. Lett. 7(4), 831–837 (2013)
Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
Babaie-Kafaki, S.: A modified scaling parameter for the memoryless BFGS updating formula. Numer. Algorithms 72(2), 425–433 (2016)
Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30(4), 673–681 (2015)
Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234(5), 1374–1386 (2010)
Dai, Y.H.: New properties of a nonlinear conjugate gradient method. Numer. Math. 89(1), 83–98 (2001)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), Art. 15, 5 pp. (2016)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49(6), 409–436 (1952)
Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Xu, C., Zhang, J.Z.: A survey of quasi-Newton equations and quasi-Newton methods for optimization. Ann. Oper. Res. 103(1–4), 213–234 (2001)
Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)
Acknowledgements
This research was supported by the Research Councils of Semnan University and Ferdowsi University of Mashhad. The authors thank the anonymous reviewer for the valuable comments, which helped to improve the presentation.
Cite this article
Babaie-Kafaki, S., Ghanbari, R. A Linear Hybridization of the Hestenes–Stiefel Method and the Memoryless BFGS Technique. Mediterr. J. Math. 15, 86 (2018). https://doi.org/10.1007/s00009-018-1132-x