Abstract
A new quasi-Newton method with a diagonal updating matrix is proposed, in which the diagonal elements are determined by forward or by central finite differences. The search direction is a direction of sufficient descent, and the algorithm is equipped with an acceleration scheme. The convergence of the algorithm is linear. The preliminary computational experiments use a set of 75 unconstrained optimization test problems classified into five groups according to the structure of their Hessian: diagonal, block-diagonal, band (tri- or penta-diagonal), sparse, and dense. With respect to the CPU time metric, intensive numerical experiments show that, for problems whose Hessian is diagonal, block-diagonal, or banded, the algorithm with diagonal finite-difference approximation of the Hessian outperforms the well-established steepest-descent and Broyden–Fletcher–Goldfarb–Shanno algorithms. As a by-product of this numerical study, we also show that the Broyden–Fletcher–Goldfarb–Shanno algorithm is the fastest on problems with a sparse Hessian, followed by problems with a dense Hessian.
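The abstract describes the method only at a high level. As a minimal sketch of the general idea, assuming function-value second differences for the Hessian diagonal and a simple positivity safeguard (the helper names `diag_hessian_fd` and `descent_direction`, the step size `h`, and the safeguard rule are illustrative assumptions, not the paper's exact update or acceleration scheme):

```python
import numpy as np

def diag_hessian_fd(f, x, h=1e-4, scheme="central"):
    """Estimate the diagonal of the Hessian of f at x by second-order
    finite differences of function values, one coordinate at a time."""
    n = x.size
    fx = f(x)
    d = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        if scheme == "central":
            # central second difference, O(h^2) accurate
            d[i] = (f(x + e) - 2.0 * fx + f(x - e)) / h**2
        else:
            # forward second difference, O(h) accurate
            d[i] = (f(x + 2 * e) - 2.0 * f(x + e) + fx) / h**2
    return d

def descent_direction(g, diag, eps=1e-8):
    """Diagonal quasi-Newton direction d = -B^{-1} g with B = diag(diag).
    Non-positive diagonal entries fall back to 1, i.e. to a steepest-descent
    component, so d is always a direction of descent."""
    safe = np.where(diag > eps, diag, 1.0)
    return -g / safe

# One step on the convex quadratic f(x) = 0.5 * sum(i * x_i^2):
f = lambda x: 0.5 * x @ (np.arange(1, x.size + 1) * x)
x = np.ones(5)
g = np.arange(1, 6) * x                  # exact gradient of f at x
d = descent_direction(g, diag_hessian_fd(f, x))
x = x + 1.0 * d                          # step length from a Wolfe line search in practice
```

Each diagonal estimate costs about 2n + 1 function evaluations per iteration; this overhead is easiest to amortize when the true Hessian is (close to) diagonal, which is consistent with the experimental findings summarized above.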
Additional information
Communicated by Evgeni Alekseevich Nurminski.
Dr. Neculai Andrei is a full member of the Academy of Romanian Scientists.
About this article
Cite this article
Andrei, N. Diagonal Approximation of the Hessian by Finite Differences for Unconstrained Optimization. J Optim Theory Appl 185, 859–879 (2020). https://doi.org/10.1007/s10957-020-01676-z
Keywords
- Unconstrained optimization
- Diagonal quasi-Newton update
- Forward differences
- Central differences
- Numerical comparisons