
An adaptive scaled BFGS method for unconstrained optimization

Original Paper · Numerical Algorithms

Abstract

A new adaptive scaled Broyden-Fletcher-Goldfarb-Shanno (BFGS) method for unconstrained optimization is presented. The third term in the standard BFGS update formula is scaled in order to reduce the large eigenvalues of the approximation to the Hessian of the minimizing function. Under inexact Wolfe line search conditions, the global convergence of the adaptive scaled BFGS method is proved under very general assumptions, without requiring convexity of the minimizing function. Preliminary numerical experiments on 80 unconstrained optimization test functions with a medium number of variables show that this variant of the scaled BFGS method is more efficient than both the standard BFGS update and several other scaled BFGS methods.
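To make the construction concrete, here is a minimal Python sketch of a BFGS iteration in which the third (rank-one) term of the update is multiplied by a scaling parameter. The specific adaptive choice of that parameter is the paper's contribution and is not reproduced here: the `delta_fn` hook below is a hypothetical placeholder (its default reproduces standard BFGS), and the fallback step length is an arbitrary safeguard.

```python
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der  # Wolfe line search + test problem

def scaled_bfgs(f, grad, x0, delta_fn=None, tol=1e-6, max_iter=500):
    """BFGS with a scaled third term:
    B_{k+1} = B_k - (B_k s s^T B_k)/(s^T B_k s) + delta_k (y y^T)/(y^T s).
    delta_fn(s, y) -> positive scalar; None gives delta_k = 1 (standard BFGS).
    """
    B = np.eye(x0.size)                 # initial Hessian approximation B_0 = I
    x, g = x0.astype(float), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -np.linalg.solve(B, g)      # quasi-Newton search direction
        alpha = line_search(f, grad, x, p)[0]
        if alpha is None:               # line search failed: small safeguarded step
            alpha = 1e-4
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:               # curvature condition keeps B positive definite
            delta = 1.0 if delta_fn is None else delta_fn(s, y)
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + delta * np.outer(y, y) / (y @ s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function with the standard (delta_k = 1) update.
x_star = scaled_bfgs(rosen, rosen_der, np.array([-1.2, 1.0]))
```

Choosing delta_k < 1 shrinks the y y^T / (y^T s) term of the update, which in turn reduces the large eigenvalues of the Hessian approximation; this is the mechanism the abstract describes.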



Author information

Correspondence to Neculai Andrei.


Cite this article

Andrei, N. An adaptive scaled BFGS method for unconstrained optimization. Numer Algor 77, 413–432 (2018). https://doi.org/10.1007/s11075-017-0321-1

