
Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases


This work investigates finite differences and the use of (diagonal) quadratic interpolation models to obtain approximations to the first and (non-mixed) second derivatives of a function. Here, it is shown that if a particular set of points is used in the interpolation model, then the solution to the associated linear system (i.e., approximations to the gradient and diagonal of the Hessian) can be obtained in \(\mathcal {O}(n)\) computations, which is the same cost as finite differences, and is a saving over the \(\mathcal {O}(n^{3})\) cost when solving a general unstructured linear system. Moreover, if the interpolation points are chosen in a particular way, then the gradient approximation is \(\mathcal {O}(h^{2})\) accurate, where h is related to the distance between the interpolation points. Numerical examples confirm the theoretical results.
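As an illustration of the kind of computation described above, the sketch below fits a diagonal quadratic model through the 2n + 1 aligned points x, x ± h·e_i. For such a structured point set the interpolation system decouples coordinate-wise, so the gradient and diagonal-Hessian approximations cost O(n) function-value combinations, and the gradient estimate is O(h²) accurate. This is an illustrative sketch only: it uses coordinate directions, whereas the paper's construction uses aligned regular bases, and the function name `diag_quadratic_model` is ours, not the paper's.

```python
import numpy as np

def diag_quadratic_model(f, x, h):
    """Approximate the gradient and the diagonal of the Hessian of f at x.

    Interpolates f on the 2n+1 aligned points x and x +/- h*e_i.  Because the
    points are aligned with the basis directions, the linear system decouples
    and the solution reduces to central differences, computable in O(n):
        g_i ~ (f(x + h e_i) - f(x - h e_i)) / (2h)          # O(h^2) accurate
        d_i ~ (f(x + h e_i) - 2 f(x) + f(x - h e_i)) / h^2
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    f0 = f(x)
    g = np.empty(n)  # gradient approximation
    d = np.empty(n)  # diagonal-Hessian approximation
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x + e), f(x - e)
        g[i] = (fp - fm) / (2.0 * h)
        d[i] = (fp - 2.0 * f0 + fm) / h**2
    return g, d

# For the quadratic f(x) = x0^2 + 3*x1^2, the exact gradient at (1, 2) is
# (2, 12) and the exact Hessian diagonal is (2, 6); central differences
# recover both up to rounding error.
g, d = diag_quadratic_model(lambda z: z[0]**2 + 3.0 * z[1]**2,
                            np.array([1.0, 2.0]), 1e-3)
```

For a general (non-quadratic) smooth f these formulas are approximations whose gradient error shrinks like h², consistent with the accuracy result stated above.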


Figures 1–3 (images not included in this version).


Notes

  1. The work in [18] appeared after the present work.

  2. Note that α and γ are functions of n, but for notational simplicity we do not write this dependence explicitly.

  3. Likewise, μ and ω are functions of n, but for notational simplicity we do not write this dependence explicitly.


References

  1. Audet, C., Hare, W.: Derivative-Free and Blackbox Optimization. Springer Series in Operations Research and Financial Engineering. Springer (2017)

  2. Bandeira, A.S., Scheinberg, K., Vicente, L.N.: Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization. Math. Program. Ser. B 134, 223–257 (2012)

  3. Baydin, A.G., Pearlmutter, B.A., Radul, A.A., Siskind, J.M.: Automatic differentiation in machine learning: a survey. J. Mach. Learn. Res. 18, 1–43 (2018)

  4. Berahas, A.S., Cao, L., Choromanski, K., Scheinberg, K.: A theoretical and empirical comparison of gradient approximations in derivative-free optimization. Tech. rep., Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA, USA. arXiv:1905.01332v2 [math.OC] (2019)

  5. Cocchi, G., Liuzzi, G., Papini, A., Sciandrone, M.: An implicit filtering algorithm for derivative-free multiobjective optimization with box constraints. Comput. Optim. Appl. 69(2), 267–296 (2018)

  6. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-Free Optimization. MPS-SIAM Series on Optimization. SIAM, Philadelphia (2009)

  7. Conn, A.R., Scheinberg, K., Toint, P.L.: Recent progress in unconstrained nonlinear optimization without derivatives. Math. Program. 79, 397–414 (1997)

  8. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of interpolation sets in derivative free optimization. Math. Program. Ser. B 111, 141–172 (2008)

  9. Conn, A.R., Scheinberg, K., Vicente, L.N.: Geometry of sample sets in derivative-free optimization: polynomial regression and underdetermined interpolation. IMA J. Numer. Anal. 28, 721–748 (2008)

  10. Conn, A.R., Toint, P.L.: An algorithm using quadratic interpolation for unconstrained derivative free optimization. In: Di Pillo, G., Giannessi, F. (eds.) Nonlinear Optimization and Applications, pp. 27–47. Springer US, Boston, MA (1996)

  11. Coope, I., Price, C.: Frame-based methods for unconstrained optimization. J. Optim. Theory Appl. 107(2), 261–274 (2000)

  12. Coope, I.D., Tappenden, R.: Efficient calculation of regular simplex gradients. Comput. Optim. Appl. 72(3), 561–588 (2019)

  13. Fasano, G., Morales, J.L., Nocedal, J.: On the geometry phase in model-based algorithms for derivative-free optimization. Optim. Methods Softw. 24(1), 145–154 (2009)

  14. Fazel, M., Ge, R., Kakade, S., Mesbahi, M.: Global convergence of policy gradient methods for the linear quadratic regulator. Proc. Mach. Learn. Res. (PMLR) 80, 1467–1476 (2018). International Conference on Machine Learning, 10–15 July 2018, Stockholm, Sweden

  15. Gilmore, P., Kelley, C.T.: An implicit filtering algorithm for optimization of functions with many local minima. SIAM J. Optim. 5(2), 269–285 (1995)

  16. Gilmore, P., Kelley, C.T., Miller, C.T., Williams, G.A.: Implicit filtering and optimal design problems. In: Borggaard, J., Burkardt, J., Gunzburger, M., Peterson, J. (eds.) Optimal Design and Control, pp. 159–176. Birkhäuser, Boston (1995)

  17. Hare, W., Jaberipour, M.: Adaptive interpolation strategies in derivative-free optimization: a case study. Tech. rep., University of British Columbia, Canada, and Amirkabir University of Technology, Iran. arXiv:1511.02794v1 [math.OC] (2015)

  18. Hare, W., Jarry-Bolduc, G., Planiden, C.: Error bounds for overdetermined and underdetermined generalized centred simplex gradients. Tech. rep., University of British Columbia, Canada, and University of Wollongong, Australia. arXiv:2006.00742v1 [math.NA] (2020)

  19. Hoffmann, P.H.W.: A hitchhiker’s guide to automatic differentiation. Numer. Algorithms 72(3), 775–811 (2016)

  20. Jarry-Bolduc, G., Nadeau, P., Singh, S.: Uniform simplex of an arbitrary orientation. Optim. Lett. (2019). Published online 3 July 2019

  21. Maggiar, A., Wächter, A., Dolinskaya, I.S., Staum, J.: A derivative-free trust-region algorithm for the optimization of functions smoothed via Gaussian convolution using adaptive multiple importance sampling. SIAM J. Optim. 28(2), 1478–1507 (2018)

  22. Margossian, C.C.: A review of automatic differentiation and its efficient implementation. Tech. rep., Department of Statistics, Columbia University. arXiv:1811.05031v2 [cs.MS] (2019)

  23. Nelder, J., Mead, R.: A simplex method for function minimization. Comput. J. 7(4), 308–313 (1965)

  24. Nesterov, Y., Spokoiny, V.: Random gradient-free minimization of convex functions. Found. Comput. Math. 17(2), 527–566 (2017)

  25. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research. Springer (2006)

  26. Spendley, W., Hext, G., Himsworth, F.: Sequential application of simplex designs in optimisation and evolutionary operation. Technometrics 4, 441–461 (1962)

  27. Wild, S.M., Shoemaker, C.: Global convergence of radial basis function trust-region algorithms for derivative-free optimization. SIAM Rev. 55(2), 349–371 (2013)


Author information



Corresponding author

Correspondence to Rachael Tappenden.



Cite this article

Coope, I.D., Tappenden, R. Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases. Numer Algor 88, 767–791 (2021).



Keywords

  • Derivative free optimization
  • Positive bases
  • Finite difference approximations
  • Interpolation models
  • Simplices
  • Simplex gradients
  • Preconditioning