Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)
Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Andrei, N.: An adaptive scaled BFGS method for unconstrained optimization. Numer. Algorithms 77(2), 413–432 (2018)
Andrei, N.: A double parameter scaled BFGS method for unconstrained optimization. J. Comput. Appl. Math. 332, 26–44 (2018)
Babaie-Kafaki, S.: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4), 361–374 (2013)
Babaie-Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261(5), 172–182 (2014)
Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
Babaie-Kafaki, S.: A modified scaling parameter for the memoryless BFGS updating formula. Numer. Algorithms 72(2), 425–433 (2016)
Babaie-Kafaki, S., Fatemi, M., Mahdavi-Amiri, N.: Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numer. Algorithms 58(3), 315–331 (2011)
Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)
Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N.I.M., Orban, D., Toint, P.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Gould, N.I.M., Scott, J.: A note on performance profiles for benchmarking software. ACM Trans. Math. Softw. 43(2), Art. 15, 5 (2016)
Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23(4), 707–716 (1986)
Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
Liao, A.: Modifying the BFGS method. Oper. Res. Lett. 20(4), 171–177 (1997)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
Oren, S.S., Spedicato, E.: Optimal conditioning of self-scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)
Ou, Y.: A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods. J. Comput. Appl. Math. 332, 101–106 (2018)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)
Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)