On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae

Journal of Optimization Theory and Applications

Abstract

Based on eigenvalue analyses, well-structured upper bounds for the condition numbers of the scaled memoryless Broyden–Fletcher–Goldfarb–Shanno and Davidon–Fletcher–Powell quasi-Newton updating formulae are obtained. It is then shown that the scaling parameter proposed by Oren and Spedicato is the unique minimizer of the given upper bound for the condition number of the scaled memoryless Broyden–Fletcher–Goldfarb–Shanno update, and that the scaling parameter proposed by Oren and Luenberger is the unique minimizer of the given upper bound for the condition number of the scaled memoryless Davidon–Fletcher–Powell update. Thus, the scaling parameters proposed by Oren et al. may enhance the numerical stability of the self-scaling memoryless Broyden–Fletcher–Goldfarb–Shanno and Davidon–Fletcher–Powell methods.
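As a concrete illustration of the quantities discussed in the abstract, the following is a minimal numerical sketch (Python with NumPy), not code from the paper. It builds the scaled memoryless BFGS and DFP matrices by applying the standard updates to H = θI and compares their condition numbers under different scalings. The closed forms used for the two parameters, θ = (sᵀy)/(yᵀy) attributed to Oren and Spedicato and θ = (sᵀs)/(sᵀy) attributed to Oren and Luenberger, are the forms commonly cited in the self-scaling literature and are an assumption here, as are all variable names.

```python
import numpy as np

def memoryless_bfgs(s, y, theta):
    # Standard BFGS update applied to the scaled identity H = theta*I:
    # H+ = theta*(I - rho*s*y')*(I - rho*y*s') + rho*s*s',  rho = 1/(s'y).
    rho = 1.0 / (s @ y)                  # requires s'y > 0 (e.g., Wolfe line search)
    I = np.eye(s.size)
    return (theta * I
            - theta * rho * (np.outer(s, y) + np.outer(y, s))
            + rho * (1.0 + theta * rho * (y @ y)) * np.outer(s, s))

def memoryless_dfp(s, y, theta):
    # Standard DFP update applied to H = theta*I:
    # H+ = theta*I - theta*y*y'/(y'y) + s*s'/(s'y).
    return (theta * np.eye(s.size)
            - theta * np.outer(y, y) / (y @ y)
            + np.outer(s, s) / (s @ y))

rng = np.random.default_rng(0)
s = rng.standard_normal(8)
y = s + 0.2 * rng.standard_normal(8)     # small perturbation keeps s'y > 0 here

theta_os = (s @ y) / (y @ y)             # assumed Oren--Spedicato form
theta_ol = (s @ s) / (s @ y)             # assumed Oren--Luenberger form

for label, theta in [("theta_OS", theta_os), ("theta_OL", theta_ol), ("theta=1", 1.0)]:
    kb = np.linalg.cond(memoryless_bfgs(s, y, theta))
    kd = np.linalg.cond(memoryless_dfp(s, y, theta))
    print(f"{label:9s}  cond(BFGS) = {kb:10.3e}  cond(DFP) = {kd:10.3e}")
```

Both scaling quantities coincide with the two Barzilai–Borwein stepsizes [7], which is one reason they arise as natural choices; the printed condition numbers merely illustrate the conditioning effect the paper analyzes, not its actual bounds.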

References

  1. Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)

  2. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  3. Oren, S.S.: Self-scaling variable metric (SSVM) algorithms. II. Implementation and experiments. Manag. Sci. 20(5), 863–874 (1974)

  4. Oren, S.S., Luenberger, D.G.: Self-scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Manag. Sci. 20(5), 845–862 (1974)

  5. Oren, S.S., Spedicato, E.: Optimal conditioning of self-scaling variable metric algorithms. Math. Program. 10(1), 70–90 (1976)

  6. Yin, H.X., Du, D.L.: The global convergence of self-scaling BFGS algorithm with nonmonotone line search for unconstrained nonconvex optimization problems. Acta Math. Sin. Engl. Ser. 23(7), 1233–1240 (2007)

  7. Barzilai, J., Borwein, J.M.: Two-point stepsize gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)

  8. Andrei, N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)

  9. Andrei, N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)

  10. Andrei, N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)

  11. Andrei, N.: A scaled nonlinear conjugate gradient algorithm for unconstrained optimization. Optimization 57(4), 549–570 (2008)

  12. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  13. Babaie-Kafaki, S.: A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei. Comput. Optim. Appl. 52(2), 409–414 (2012)

  14. Babaie-Kafaki, S.: A new proof for the sufficient descent condition of Andrei’s scaled conjugate gradient algorithms. Pac. J. Optim. 9(1), 23–28 (2013)

  15. Babaie-Kafaki, S.: A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization. 4OR 11(4), 361–374 (2013)

  16. Babaie-Kafaki, S.: Two modified scaled nonlinear conjugate gradient methods. J. Comput. Appl. Math. 261(5), 172–182 (2014)

  17. Babaie-Kafaki, S., Ghanbari, R.: A modified scaled conjugate gradient method with global convergence for nonconvex functions. Bull. Belg. Math. Soc. Simon Stevin 21(3), 465–477 (2014)

  18. Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)

  19. Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)

  20. Zhang, J.Z., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102(1), 147–167 (1999)

  21. Zhang, J., Xu, C.: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137(2), 269–278 (2001)

  22. Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234(5), 1374–1386 (2010)

  23. Yuan, Y.X., Byrd, R.H.: Non-quasi-Newton updates for unconstrained optimization. J. Comput. Math. 13(2), 95–107 (1995)

  24. Wei, Z., Li, G., Qi, L.: New quasi-Newton methods for unconstrained optimization problems. Appl. Math. Comput. 175(2), 1156–1188 (2006)

  25. Babaie-Kafaki, S.: A modified BFGS algorithm based on a hybrid secant equation. Sci. China Math. 54(9), 2019–2036 (2011)

  26. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)

  27. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)

  28. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  29. Watkins, D.S.: Fundamentals of Matrix Computations. Wiley, New York (2002)

  30. Babaie-Kafaki, S.: A quadratic hybridization of Polak–Ribière–Polyak and Fletcher–Reeves conjugate gradient methods. J. Optim. Theory Appl. 154(3), 916–932 (2012)

  31. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2, Ser. A), 201–213 (2002)

  32. Andrei, N.: Open problems in conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)

Acknowledgments

This research was supported by a grant from IPM (No. 93650051). The author is grateful to the anonymous reviewers for their valuable comments and suggestions, which helped to improve the presentation. He also thanks Professor Michael Navon for providing the line search code.

Author information

Correspondence to Saman Babaie-Kafaki.

Cite this article

Babaie-Kafaki, S. On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae. J Optim Theory Appl 167, 91–101 (2015). https://doi.org/10.1007/s10957-015-0724-x
