Abstract
An explicit matrix-algebra formula for approximating the diagonal entries of a Hessian matrix from any number of sample points is introduced. When the derivative-free technique known as the generalized centered simplex gradient is used to approximate the gradient, the formula can be computed at the cost of only one additional function evaluation. An error bound is established, indicating how the sample set of points should be structured in order to approximate the diagonal of a Hessian matrix accurately. If the sample set is built in a specific manner, the technique is shown to be an \(\mathcal{O}(\Delta_S^2)\) accurate approximation of the diagonal entries of the Hessian matrix, where \(\Delta_S\) is the radius of the sample set.
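To illustrate the kind of approximation the abstract describes, the sketch below computes the diagonal of a Hessian using the classical centered second-difference scheme, i.e., the minimal sample set \(\{x \pm h e_i\}\). This is only the textbook special case, not the paper's generalized formula for arbitrary sample sets; the function name and step size `h` are illustrative assumptions.

```python
import numpy as np

def centered_diagonal_hessian(f, x, h=1e-3):
    """Approximate the diagonal of the Hessian of f at x.

    Uses classical centered second differences over the minimal
    sample set {x + h*e_i, x - h*e_i}; for smooth f the entrywise
    error is O(h^2), mirroring the O(Delta_S^2) rate in the abstract.
    (Illustrative special case only, not the paper's general formula.)
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    fx = f(x)
    diag = np.empty(n)
    for i in range(n):
        step = np.zeros(n)
        step[i] = h
        # Centered second difference along coordinate i.
        diag[i] = (f(x + step) - 2.0 * fx + f(x - step)) / h**2
    return diag
```

For a quadratic such as \(f(x) = x_1^2 + 3x_2^2\), the scheme is exact up to floating-point error, returning approximately \([2, 6]\) at any point.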
Data availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
Acknowledgements
Jarry-Bolduc would like to acknowledge UBC for the funding received through the University Graduate Fellowship award.
Funding
Jarry-Bolduc’s research is partially funded by the Natural Sciences and Engineering Research Council (NSERC) of Canada, Discovery Grant #2018-03865.
Ethics declarations
Conflict of interest
The author declares no competing interests.
Cite this article
Jarry-Bolduc, G. Approximating the diagonal of a Hessian: which sample set of points should be used. Numer Algor 91, 1349–1361 (2022). https://doi.org/10.1007/s11075-022-01304-z