
A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization

Journal of Optimization Theory and Applications

Abstract

In this paper, the first two terms on the right-hand side of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update are scaled with a positive parameter, while the third term is scaled with another positive parameter. These two scaling parameters are determined by minimizing the measure function introduced by Byrd and Nocedal (SIAM J Numer Anal 26:727–739, 1989). The resulting algorithm is close to the algorithm based on clustering the eigenvalues of the BFGS approximation to the Hessian and on shifting its large eigenvalues to the left, but it is not superior to it. Under classical assumptions, convergence is proved by using the trace and the determinant of the iteration matrix. On a set of 80 unconstrained optimization test problems, the algorithm minimizing the measure function of Byrd and Nocedal is shown to be more efficient and more robust than several other scaling BFGS algorithms, including the variants of Biggs (J Inst Math Appl 12:337–338, 1973), Yuan (IMA J Numer Anal 11:325–332, 1991), Oren and Luenberger (Manag Sci 20:845–862, 1974) and Nocedal and Yuan (Math Program 61:19–37, 1993). However, it is less efficient than the algorithms based on clustering the eigenvalues of the iteration matrix and on shifting its large eigenvalues to the left (Andrei, J Comput Appl Math 332:26–44, 2018; Numer Algorithms 77:413–432, 2018).
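The measure function of Byrd and Nocedal [5] combines the trace and the determinant of a symmetric positive definite matrix, psi(B) = tr(B) - ln det(B), and is minimized at the identity, where it equals the dimension n. A minimal sketch of this function, restricted to 2×2 matrices purely for illustration (the paper itself works with general n×n iteration matrices):

```python
import math

def byrd_nocedal_measure(B):
    """psi(B) = tr(B) - ln det(B) for a 2x2 symmetric positive definite B.

    Illustrative only: trace and determinant are written out by hand
    for the 2x2 case instead of using a linear-algebra library.
    """
    (a, b), (c, d) = B
    trace = a + d
    det = a * d - b * c
    if det <= 0:
        raise ValueError("B must be positive definite")
    return trace - math.log(det)

# psi is minimized at the identity, where psi(I) = n = 2:
print(byrd_nocedal_measure([[1.0, 0.0], [0.0, 1.0]]))  # -> 2.0
print(byrd_nocedal_measure([[2.0, 0.0], [0.0, 3.0]]))  # 5 - ln 6, larger than 2
```

Because psi penalizes both large eigenvalues (through the trace) and near-singular ones (through the log-determinant), choosing scaling parameters that minimize it keeps the BFGS iteration matrix well conditioned.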


References

  1. Dennis, J.E., Moré, J.J.: A characterization of superlinear convergence and its application to quasi-Newton methods. Math. Comput. 28, 549–560 (1974)
  2. Dennis, J.E., Moré, J.J.: Quasi-Newton methods, motivation and theory. SIAM Rev. 19, 46–89 (1977)
  3. Nocedal, J.: Theory of algorithms for unconstrained optimization. Acta Numer. 1, 199–242 (1992)
  4. Powell, M.J.D.: Some global convergence properties of a variable metric algorithm for minimization without exact line search. In: Cottle, R.W., Lemke, C.E. (eds.) Nonlinear Programming, SIAM-AMS Proceedings, vol. IX, pp. 53–72. SIAM, Philadelphia (1976)
  5. Byrd, R., Nocedal, J.: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 26, 727–739 (1989)
  6. Byrd, R., Nocedal, J., Yuan, Y.: Global convergence of a class of quasi-Newton methods on convex problems. SIAM J. Numer. Anal. 24, 1171–1189 (1987)
  7. Dixon, L.C.W.: Variable metric algorithms: necessary and sufficient conditions for identical behavior on nonquadratic functions. J. Optim. Theory Appl. 10, 34–40 (1972)
  8. Griewank, A.: The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients. Math. Program. 50, 141–175 (1991)
  9. Powell, M.J.D.: On the convergence of the variable metric algorithm. J. Inst. Math. Appl. 7, 21–36 (1971)
  10. Powell, M.J.D.: Updating conjugate directions by the BFGS formula. Math. Program. 38, 693–726 (1987)
  11. Mascarenhas, W.F.: The BFGS method with exact line searches fails for non-convex objective functions. Math. Program. Ser. A 99, 49–61 (2004)
  12. Dai, Y.-H.: Convergence properties of the BFGS algorithm. SIAM J. Optim. 13, 693–701 (2002)
  13. Li, D.-H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)
  14. Li, D.-H., Fukushima, M.: On the global convergence of the BFGS method for nonconvex unconstrained optimization problems. SIAM J. Optim. 11(4), 1054–1064 (2001)
  15. Andrei, N.: Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization. Optim. Methods Softw. 32(3), 534–551 (2017)
  16. Byrd, R.H., Liu, D.C., Nocedal, J.: On the behavior of Broyden’s class of quasi-Newton methods. SIAM J. Optim. 2, 533–557 (1992)
  17. Gill, P.E., Leonard, M.W.: Reduced-Hessian quasi-Newton methods for unconstrained optimization. SIAM J. Optim. 12, 209–237 (2001)
  18. Contreras, M., Tapia, R.A.: Sizing the BFGS and DFP updates: a numerical study. J. Optim. Theory Appl. 78, 93–108 (1993)
  19. Oren, S.S., Luenberger, D.G.: Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms. Manag. Sci. 20, 845–862 (1974)
  20. Oren, S.S., Spedicato, E.: Optimal conditioning of self-scaling variable metric algorithms. Math. Program. 10, 70–90 (1976)
  21. Shanno, D.F., Phua, K.H.: Matrix conditioning and nonlinear optimization. Math. Program. 14, 149–160 (1978)
  22. Yabe, H., Martinez, H.J., Tapia, R.A.: On sizing and shifting the BFGS update within the sized Broyden family of secant updates. SIAM J. Optim. 15(1), 139–160 (2004)
  23. Andrei, N.: An adaptive scaled BFGS method for unconstrained optimization. Numer. Algorithms 77(2), 413–432 (2018)
  24. Andrei, N.: A double parameter scaled BFGS method for unconstrained optimization. J. Comput. Appl. Math. 332, 26–44 (2018)
  25. Biggs, M.C.: Minimization algorithms making use of non-quadratic properties of the objective function. J. Inst. Math. Appl. 8, 315–327 (1971)
  26. Biggs, M.C.: A note on minimization algorithms making use of non-quadratic properties of the objective function. J. Inst. Math. Appl. 12, 337–338 (1973)
  27. Liao, A.: Modifying BFGS method. Oper. Res. Lett. 20, 171–177 (1997)
  28. Nocedal, J., Yuan, Y.X.: Analysis of self-scaling quasi-Newton method. Math. Program. 61, 19–37 (1993)
  29. Oren, S.S.: Self-scaling variable metric algorithms for unconstrained minimization. Ph.D. Thesis, Department of Engineering-Economic Systems, Stanford University, Stanford (1972)
  30. Al-Baali, M.: Numerical experience with a class of self-scaling quasi-Newton algorithms. J. Optim. Theory Appl. 96, 533–553 (1998)
  31. Al-Baali, M., Grandinetti, L.: On practical modifications of the quasi-Newton BFGS method. AMO-Adv. Modell. Optim. 11(1), 63–76 (2009)
  32. Arzam, M.R., Babaie-Kafaki, S., Ghanbari, R.: An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions. Glasnik Matematicki 52(72), 361–375 (2017)
  33. Wei, Z., Yu, G., Yuan, G., Lian, Z.: The superlinear convergence of a modified BFGS-type method for unconstrained optimization. Comput. Optim. Appl. 29, 315–332 (2004)
  34. Wu, G., Liang, H.: A modified BFGS method and its convergence. Comput. Modell. New Technol. 18(11), 43–47 (2014)
  35. Yabe, H., Ogasawara, H., Yoshino, M.: Local and superlinear convergence of quasi-Newton methods based on modified secant conditions. J. Comput. Appl. Math. 205, 632–717 (2007)
  36. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11, 325–332 (1991)
  37. Yuan, Y.X., Byrd, R.: Non-quasi-Newton updates for unconstrained optimization. J. Comput. Math. 13(2), 95–107 (1995)
  38. Yuan, G., Wei, Z.: Convergence analysis of a modified BFGS method on convex minimizations. Comput. Optim. Appl. 47, 237–255 (2010)
  39. Zhang, J., Deng, N.Y., Chen, L.H.: New quasi-Newton equation and related methods for unconstrained optimization. J. Optim. Theory Appl. 102, 147–167 (1999)
  40. Zhu, H., Wen, S.: A class of generalized quasi-Newton algorithms with superlinear convergence. Int. J. Nonlinear Sci. 2(3), 140–146 (2006)
  41. Zhang, J., Xu, C.: Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J. Comput. Appl. Math. 137, 269–278 (2001)
  42. Wan, Z., Huang, S., Zheng, X.D.: New cautious BFGS algorithm based on modified Armijo-type line search. J. Inequal. Appl. 241, 1–10 (2012)
  43. Wan, Z., Teo, K.L., Shen, X.L., Hu, C.M.: New BFGS method for unconstrained optimization problem based on modified Armijo line search. Optimization 63(2), 285–304 (2014)
  44. Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, C.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
  45. Yuan, G., Wei, Z., Lu, X.: Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search. Appl. Math. Model. 47, 811–825 (2017)
  46. Cheng, W.Y., Li, D.H.: Spectral scaling BFGS method. J. Optim. Theory Appl. 146, 305–319 (2010)
  47. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)
  48. Wolfe, P.: Convergence conditions for ascent methods. II: some corrections. SIAM Rev. 13, 185–188 (1971)
  49. Wang, H.J., Yuan, Y.X.: A quadratic convergence method for one-dimensional optimization. Chin. J. Oper. Res. 11, 1–10 (1992)
  50. Powell, M.J.D.: How bad are the BFGS and DFP methods when the objective function is quadratic? Math. Program. 34, 34–47 (1986)
  51. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
  52. Hestenes, M.R., Stiefel, E.L.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
  53. Andrei, N.: A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 68, 305–321 (2015)
  54. Andrei, N.: An adaptive conjugate gradient algorithm for large-scale unconstrained optimization. J. Comput. Appl. Math. 292, 83–91 (2016)
  55. Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
  56. Babaie-Kafaki, S.: A modified scaling parameter for the memoryless BFGS updating formula. Numer. Algorithms 72(2), 425–433 (2016)
  57. Fletcher, R.: An overview of unconstrained optimization. In: Spedicato, E. (ed.) Algorithms for Continuous Optimization: The State of the Art, pp. 109–143. Kluwer Academic Publishers, Boston (1994)
  58. Andrei, N.: Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology. Springer Optimization and Its Applications, vol. 121. Springer, Berlin (2017)
  59. Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)
  60. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. Electron. Int. J. 10, 147–161 (2008)
  61. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
  62. Todd, M.J.: Quasi-Newton updates in abstract spaces. SIAM Rev. 26, 367–377 (1984)


Author information


Corresponding author

Correspondence to Neculai Andrei.

Additional information

Communicated by Ilio Galligani.


Cite this article

Andrei, N. A Double-Parameter Scaling Broyden–Fletcher–Goldfarb–Shanno Method Based on Minimizing the Measure Function of Byrd and Nocedal for Unconstrained Optimization. J Optim Theory Appl 178, 191–218 (2018). https://doi.org/10.1007/s10957-018-1288-3

