Abstract
This paper considers simple modifications of the limited memory BFGS (L-BFGS) method for large scale optimization. It outlines algorithms based on alternative ways of re-using a given set of stored difference vectors. The proposed algorithms resemble the L-BFGS method, except that the initial Hessian approximation is defined implicitly, like the L-BFGS Hessian, in terms of some stored vectors rather than as the usual multiple of the unit matrix. Numerical experiments show that the new algorithms yield a desirable improvement over the L-BFGS method.
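The L-BFGS machinery behind the abstract can be sketched with the standard two-loop recursion, written so that the initial matrix H0 is supplied as a callable. The `implicit_H0` variant below is only an illustration of the idea of defining H0 implicitly through the stored difference pairs (here, by a second pass of the recursion); it is an assumption for exposition, not the paper's exact formula.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, apply_H0):
    # Standard L-BFGS two-loop recursion: returns an approximation to H @ grad,
    # where H is the BFGS matrix built from the stored (s, y) pairs on top of H0.
    q = np.asarray(grad, dtype=float).copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: most recent pair first.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q = q - a * y
    # Apply the initial Hessian approximation (passed in as a function).
    r = apply_H0(q)
    # Second loop: oldest pair first.
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r = r + (a - b) * s
    return r

def scalar_H0(s_list, y_list):
    # The usual L-BFGS choice: H0 = gamma * I with gamma = s'y / y'y
    # computed from the most recent pair.
    s, y = s_list[-1], y_list[-1]
    gamma = s.dot(y) / y.dot(y)
    return lambda v: gamma * v

def implicit_H0(s_list, y_list):
    # Hypothetical illustration of the paper's theme: define H0 implicitly,
    # like the L-BFGS Hessian itself, by re-using the stored pairs
    # (here via a second two-loop pass with a scalar inner start).
    return lambda v: lbfgs_two_loop(v, s_list, y_list,
                                    scalar_H0(s_list, y_list))
```

With either choice of H0, the resulting matrix satisfies the secant condition for the most recent pair (H y = s), which gives a quick sanity check on the recursion.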
Cite this article
Al‐Baali, M. Improved Hessian approximations for the limited memory BFGS method. Numerical Algorithms 22, 99–112 (1999). https://doi.org/10.1023/A:1019142304382