
Improved Hessian approximations for the limited memory BFGS method


Abstract

This paper considers simple modifications of the limited memory BFGS (L-BFGS) method for large-scale optimization. It outlines algorithms based on alternative ways of re-using a given set of stored difference vectors. The proposed algorithms resemble the L-BFGS method, except that the initial Hessian approximation is defined implicitly, like the L-BFGS Hessian, in terms of some stored vectors rather than as the usual multiple of the unit matrix. Numerical experiments show that the new algorithms yield a desirable improvement over the L-BFGS method.
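The abstract states the key change only at a high level: keep the standard L-BFGS recursion over the stored difference pairs (s_k, y_k), but replace the usual scalar initial matrix with an initial Hessian approximation defined implicitly by stored vectors. The sketch below is a minimal illustration of where such a modification slots into the familiar two-loop recursion; the function names and the particular re-use scheme (one extra pass over the same stored pairs) are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def two_loop(g, pairs, apply_h0):
    """Return H @ g, where H is the L-BFGS inverse-Hessian approximation
    built from the stored (s_k, y_k) pairs on top of the initial
    approximation H0, applied by the callback `apply_h0`."""
    q = g.copy()
    alphas = []
    # Backward pass over the stored pairs, newest first.
    for s, y in reversed(pairs):
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q = q - alpha * y
        alphas.append((alpha, rho))
    r = apply_h0(q)  # r = H0 q; this is where the initial matrix enters
    # Forward pass, oldest first.
    for (s, y), (alpha, rho) in zip(pairs, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return r

def scalar_h0(pairs):
    """Usual L-BFGS choice: H0 = (s^T y / y^T y) I from the latest pair."""
    s, y = pairs[-1]
    gamma = np.dot(s, y) / np.dot(y, y)
    return lambda v: gamma * v

def implicit_h0(pairs):
    """Hypothetical re-use scheme in the spirit of the abstract: define H0
    implicitly, like an L-BFGS matrix, by running the recursion once more
    over the same stored pairs on top of the scalar initial matrix."""
    return lambda v: two_loop(v, pairs, scalar_h0(pairs))

# Search direction at the current gradient g (pairs = list of (s_k, y_k)):
# d = -two_loop(g, pairs, implicit_h0(pairs))   # modified variant
# d = -two_loop(g, pairs, scalar_h0(pairs))     # standard L-BFGS
```

The only structural difference from standard L-BFGS is the `apply_h0` callback: the standard method applies a multiple of the identity, while the modified variant applies an operator that is itself an implicit L-BFGS matrix, so no n-by-n matrix is ever formed and the extra cost stays linear in n.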




Cite this article

Al-Baali, M. Improved Hessian approximations for the limited memory BFGS method. Numerical Algorithms 22, 99–112 (1999). https://doi.org/10.1023/A:1019142304382
