Abstract
A class of two-parameter scaled memoryless BFGS methods is developed for solving unconstrained optimization problems. The scaling parameters are determined so as to improve the condition number of the corresponding memoryless BFGS update. It is shown that for uniformly convex objective functions, the search directions of the method satisfy the sufficient descent condition, which leads to global convergence. To achieve convergence for general functions, a revised version of the method is developed based on the Li–Fukushima modified secant equation. To enhance the performance of both methods, a nonmonotone scheme for computing the initial value of the step length in the line search procedure is suggested. Numerical experiments are carried out on a set of unconstrained optimization test problems from the CUTEr collection; the results demonstrate the efficiency of the proposed algorithms in the sense of the Dolan–Moré performance profile.
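To fix ideas, recall the classical self-scaling memoryless BFGS update of Oren–Spedicato type, with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and a scaling parameter $\theta_k > 0$:

$$H_{k+1} = \theta_k I - \theta_k \frac{s_k y_k^T + y_k s_k^T}{s_k^T y_k} + \left(1 + \theta_k \frac{y_k^T y_k}{s_k^T y_k}\right) \frac{s_k s_k^T}{s_k^T y_k}, \qquad d_{k+1} = -H_{k+1} g_{k+1}.$$

The sketch below illustrates the main ingredients named in the abstract: the (here one-parameter) scaled memoryless BFGS direction, one common form of the Li–Fukushima modification of $y_k$ that restores positive curvature for nonconvex objectives, and a nonmonotone Armijo test in the style of Grippo, Lampariello, and Lucidi. It is a minimal illustration under stated assumptions, not the paper's method: the two-parameter update, the condition-number-based choice of the parameters, and the paper's specific nonmonotone rule for the initial step length are not reproduced here.

```python
import numpy as np

def scaled_memoryless_bfgs_direction(g, s, y, theta=1.0):
    """Compute d = -H g, where H is the memoryless BFGS update of theta*I
    built from the step s and gradient difference y.

    One-parameter illustration only; the paper develops a two-parameter
    family whose parameters are chosen to improve the condition number
    of the update."""
    sy = s @ y                                    # curvature s^T y
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g                                 # safeguard: steepest descent
    sg, yg, yy = s @ g, y @ g, y @ y
    return (-theta * g
            + theta * (yg * s + sg * y) / sy
            - (1.0 + theta * yy / sy) * (sg / sy) * s)

def li_fukushima_y(s, y):
    """A common form of the Li-Fukushima modified gradient difference:
    z = y + t*s with t = 1 + max(0, -s^T y / ||s||^2), which guarantees
    s^T z >= ||s||^2 > 0 even for nonconvex objectives. The constant in
    the paper's revised method may differ."""
    t = 1.0 + max(0.0, -(s @ y) / (s @ s))
    return y + t * s

def nonmonotone_armijo(f, x, d, g, f_hist, alpha0=1.0, delta=1e-4, rho=0.5):
    """Backtracking Armijo test against the maximum of the last few
    function values (nonmonotone, GLL-style). Here alpha0 is a plain
    trial value; the paper's nonmonotone rule for the initial step
    length is not reproduced."""
    fmax = max(f_hist)                            # nonmonotone reference value
    gd = g @ d                                    # g^T d < 0 for a descent direction
    alpha = alpha0
    while f(x + alpha * d) > fmax + delta * alpha * gd:
        alpha *= rho                              # shrink the trial step
    return alpha
```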
References
Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)
Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Andrei, N.: An adaptive scaled BFGS method for unconstrained optimization. Numer. Algorithms 77(2), 413–432 (2018)
Andrei, N.: A double parameter scaled BFGS method for unconstrained optimization. J. Comput. Appl. Math. 332, 26–44 (2018)
Babaie-Kafaki, S.: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4), 361–374 (2013)
Babaie-Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261(5), 172–182 (2014)
Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
Babaie-Kafaki, S.: A modified scaling parameter for the memoryless BFGS updating formula. Numer. Algorithms 72(2), 425–433 (2016)
Babaie-Kafaki, S., Fatemi, M., Mahdavi-Amiri, N.: Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numer. Algorithms 58(3), 315–331 (2011)
Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)
Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N.I.M., Orban, D., Toint, P.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Gould, N.I.M., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), Art. 15, 5 pp. (2016)
Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23(4), 707–716 (1986)
Hager, W.W., Zhang, H.: Algorithm 851: CG_Descent, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
Liao, A.: Modifying the BFGS method. Oper. Res. Lett. 20(4), 171–177 (1997)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
Oren, S.S., Spedicato, E.: Optimal conditioning of self-scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)
Ou, Y.: A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods. J. Comput. Appl. Math. 332, 101–106 (2018)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)
Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
Acknowledgements
This research was supported in part by grant 96013024 from the Iran National Science Foundation (INSF) and in part by the Research Council of Semnan University. The authors thank the anonymous reviewers for their valuable comments and suggestions, which helped to improve the quality of this work. They are also grateful to Professor Michael Navon for providing the line search code.
Cite this article
Babaie-Kafaki, S., Aminifard, Z.: Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length. Numer. Algorithms 82, 1345–1357 (2019). https://doi.org/10.1007/s11075-019-00658-1