
Approximating the diagonal of a Hessian: which sample set of points should be used

  • Original Paper
  • Published:
Numerical Algorithms

Abstract

An explicit formula, based on matrix algebra, for approximating the diagonal entries of a Hessian matrix from any number of sample points is introduced. When the derivative-free technique known as the generalized centered simplex gradient is used to approximate the gradient, the formula can be computed at the cost of only one additional function evaluation. An error bound is established that indicates how the sample set of points should be structured in order to approximate the diagonal of a Hessian matrix accurately. It is shown that if the sample set is built in a specific manner, the technique is an \(\mathcal{O}(\Delta_S^2)\)-accurate approximation of the diagonal entries of the Hessian matrix, where \(\Delta_S\) is the radius of the sample set.
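
The paper's explicit matrix-algebra formula is not reproduced on this page, but the flavor of the result can be illustrated with the familiar coordinate-aligned special case: when the sample points are the centered pairs \(x \pm \Delta e_i\) used by the generalized centered simplex gradient, the diagonal Hessian entries can be recovered from those same evaluations plus one extra evaluation at the center point, with \(\mathcal{O}(\Delta^2)\) error. The Python sketch below implements only this special case; the function and parameter names are illustrative and are not taken from the paper, whose formula handles general sample sets.

```python
import numpy as np

def centered_diag_hessian(f, x0, delta=1e-3):
    """Approximate the diagonal of the Hessian of f at x0 using a
    centered-difference stencil of radius delta along each coordinate
    direction.  For f three times continuously differentiable, the
    error in each entry is O(delta**2).

    Illustrative special case only: the paper's formula accepts an
    arbitrary sample set, not just coordinate directions.
    """
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    f0 = f(x0)                     # the single extra evaluation at the center
    diag = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = delta
        # second-order centered difference:
        # (f(x + delta*e_i) - 2 f(x) + f(x - delta*e_i)) / delta^2
        diag[i] = (f(x0 + e) - 2.0 * f0 + f(x0 - e)) / delta**2
    return diag

if __name__ == "__main__":
    # f(x) = x0^2 + 3*x1^2 has Hessian diagonal [2, 6].
    f = lambda x: x[0]**2 + 3.0 * x[1]**2
    print(centered_diag_hessian(f, [1.0, -2.0]))   # approximately [2., 6.]
```

In this coordinate-aligned setting the radius \(\Delta_S\) of the sample set coincides with the step `delta`, so halving the step roughly quarters the approximation error, consistent with the \(\mathcal{O}(\Delta_S^2)\) bound stated in the abstract.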


Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.


Acknowledgements

Jarry-Bolduc would like to acknowledge UBC for the funding received through the University Graduate Fellowship award.

Funding

Jarry-Bolduc’s research is partially funded by the Natural Sciences and Engineering Research Council (NSERC) of Canada, Discovery Grant #2018-03865.

Author information


Corresponding author

Correspondence to Gabriel Jarry–Bolduc.

Ethics declarations

Conflict of interest

The author declares no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Jarry–Bolduc, G. Approximating the diagonal of a Hessian: which sample set of points should be used. Numer Algor 91, 1349–1361 (2022). https://doi.org/10.1007/s11075-022-01304-z


  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s11075-022-01304-z
