An Overview of Gradient-Enhanced Metamodels with Applications

  • Original Paper
  • Published in: Archives of Computational Methods in Engineering

Abstract

Metamodeling, the science of modeling functions observed at a finite number of points, benefits from all the auxiliary information it can account for. Function gradients are a common type of auxiliary information and are useful for predicting functions with locally changing behaviors. This article reviews the main metamodels that use function gradients in addition to function values. Its goal is to give the reader an overview of the principles involved in gradient-enhanced metamodels while also providing insightful formulations. The following metamodels have gradient-enhanced versions in the literature and are reviewed here: classical, weighted and moving least squares, Shepard weighting functions, and the kernel-based methods, namely radial basis functions, kriging and support vector machines. The methods are set in a common framework of linear combinations of a priori chosen functions with coefficients that depend on the observations. The characteristics common to all kernel-based approaches are underlined. A new \(\nu\)-GSVR metamodel that uses gradients is presented. Numerical comparisons of the metamodels are carried out on analytical test functions. The experiments are replicable, as they are performed with an available open-source toolbox. The results indicate a trade-off between the lower computing time of least squares methods and the greater versatility of kernel-based approaches.
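
As a concrete and deliberately simple illustration of this common framework, the sketch below fits a gradient-enhanced least squares model in one dimension by stacking the value equations and the gradient equations into a single least squares system. It is a minimal Python/NumPy sketch under assumed choices (a quadratic basis, a 1D sine test function, illustrative names); it is not the exact formulation of the article nor the GRENAT implementation.

    import numpy as np

    def fit_gradls(x, y, dy):
        """Least squares fit of a quadratic basis p(x) = [1, x, x^2] to values and gradients."""
        P = np.column_stack([np.ones_like(x), x, x**2])                   # basis at the samples
        dP = np.column_stack([np.zeros_like(x), np.ones_like(x), 2 * x])  # basis derivatives
        A = np.vstack([P, dP])                 # stack value equations on top of gradient equations
        b = np.concatenate([y, dy])            # observed values followed by observed gradients
        beta, *_ = np.linalg.lstsq(A, b, rcond=None)
        return beta                            # coefficients of the linear combination

    # Usage: values and derivatives of f(x) = sin(x) observed at four points.
    x = np.array([0.0, 1.0, 2.0, 3.0])
    beta = fit_gradls(x, np.sin(x), np.cos(x))

The same structure, a linear combination of a priori chosen functions whose coefficients are computed from the observed values and gradients, carries over to the kernel-based metamodels reviewed in the article.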

Notes

  1. Recall that bold notations designate vectors. For example, \(\boldsymbol{\xi}^{+}=\begin{bmatrix}\xi^{+(1)} & \xi^{+(2)} & \dots & \xi^{+(n_{s})}\end{bmatrix}^\top\).

  2. As usual, the boldface notation denotes the vector \(\boldsymbol{\varepsilon}=[\varepsilon_0,\dots,\varepsilon_{n_{p}}]^\top\), which should not be confused with the errors in Sect. 4.

  3. When there are not enough equations, the pseudo-inverse restores uniqueness of the solution by choosing, among the infinitely many solutions of \(\mathbf{F}\boldsymbol{\beta}=y_{g}\), the \(\boldsymbol{\beta}\) of minimal Euclidean norm.
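
The minimal-norm property mentioned in note 3 can be checked numerically; the following is a small Python/NumPy sketch on an arbitrary illustrative underdetermined system (the data are made up, not taken from the article):

    import numpy as np

    F = np.array([[1.0, 2.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0, 3.0]])      # 2 equations, 4 unknowns
    y_g = np.array([1.0, 2.0])

    beta = np.linalg.pinv(F) @ y_g            # pseudo-inverse: minimal Euclidean norm solution
    assert np.allclose(F @ beta, y_g)         # beta indeed solves the system

    # Adding any null-space component keeps F @ beta unchanged but increases the norm.
    null_vec = np.linalg.svd(F)[2][-1]        # a unit vector in the null space of F
    other = beta + null_vec
    assert np.allclose(F @ other, y_g)
    assert np.linalg.norm(other) > np.linalg.norm(beta)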

Abbreviations

\(\nu \)-GSVR:

\(\nu \)-Version of the gradient-enhanced support vector regression (surrogate model)

\(\varepsilon \)-SVR:

\(\varepsilon \)-Version of the support vector regression (surrogate model)

\(\nu \)-SVR:

\(\nu \)-Version of the support vector regression (surrogate model)

\(\varepsilon _k\)-GSVR:

\(\varepsilon _k\)-Version of the gradient-enhanced support vector regression (surrogate model)

BLUP:

Best linear unbiased predictor

EGO:

Efficient global optimization [33]

GBK:

Gradient-based kriging (surrogate model)

GEK:

Gradient-enhanced kriging (surrogate model)

GEUK:

Gradient-enhanced universal kriging (surrogate model)

GKRG:

Gradient-enhanced cokriging (surrogate model, same as GBK, GEK and GEUK)

GLS:

Generalized least square regression (surrogate model)

GradLS:

Gradient-enhanced least square regression (surrogate model)

GRBF:

Gradient-enhanced radial basis function (surrogate model)

GRENAT:

Gradient-enhanced approximation toolbox (Matlab/Octave toolbox [12])

GSVR:

Gradient-enhanced support vector regression (surrogate model)

IDW:

Inverse distance weighting method, also called the Shepard weighting method (surrogate model)

IHS:

Improved hypercube sampling (sampling technique [103])

InOK:

Indirect gradient-enhanced ordinary kriging (surrogate model)

InRBF:

Indirect gradient-enhanced radial basis function (surrogate model)

KRG:

Kriging (surrogate model)

LOO:

Leave-one-out

LS:

Least square regression (surrogate model)

MLS:

Moving least square regression (surrogate model)

MSE:

Mean square error (quality criterion)

MultiDOE:

Multiple design of experiments (Matlab/Octave toolbox [133])

OK:

Ordinary kriging (surrogate model)

OCK:

Gradient-enhanced ordinary cokriging (surrogate model)

RBF:

Radial basis function (surrogate model)

RSM:

Response surface methodology

SBAO:

Surrogate-based analysis and optimization [18]

SVM:

Support vector machine

SVR:

Support vector regression (surrogate model)

WLS:

Weighted least square regression (surrogate model)

MARS:

Multivariate adaptive regression splines (surrogate model)

ANN:

Artificial neural network (surrogate model)

LHS:

Latin hypercube sampling (sampling technique)

OA:

Orthogonal array (sampling technique)

UD:

Uniform design (sampling technique)

RAAE:

Relative average absolute error (quality criterion)

RMSE:

Root mean square error (quality criterion)

RMAE:

Relative maximal absolute error (quality criterion)

LOOCV:

Leave-one-out cross-validation

References

  1. Hughes GF (1968) On the mean accuracy of statistical pattern recognizers. IEEE Trans Inf Theory 14(1):55–63

  2. Lions J-L (1971) Optimal control of systems governed by partial differential equations. Springer, New York

  3. Cacuci DG (1981) Sensitivity theory for nonlinear systems. I. Nonlinear functional analysis approach. J Math Phys 22(12):2794–2802

  4. Jameson A (1988) Aerodynamic design via control theory. J Sci Comput 3(3):233–260

  5. Beda L, Korolev L, Sukkikh N, Frolova T (1959) Programs for automatic differentiation for the machine. Technical Report, BESM, Institute for Precise Mechanics and Computation Techniques, Academy of Science, Moscow

  6. Wengert RE (1964) A simple automatic derivative evaluation program. Commun ACM 7:463–464

  7. Griewank A (1989) On automatic differentiation. In: Mathematical programming: recent developments and applications. Kluwer, Amsterdam, pp 83–108

  8. Paoletti V, Fedi M, Italiano F, Florio G, Ialongo S (2016) Inversion of gravity gradient tensor data: does it provide better resolution? Geophys J Int 205(1):192–202

  9. Qin P, Huang D, Yuan Y, Geng M, Liu J (2016) Integrated gravity and gravity gradient 3D inversion using the non-linear conjugate gradient. J Appl Geophys 126:52–73

  10. Lorentz R (2000) Multivariate Hermite interpolation by algebraic polynomials: a survey. J Comput Appl Math 122(1–2):167–201. Numerical analysis in the 20th century, vol. II: interpolation and extrapolation

  11. Lai M-J (2007) Multivariate splines for data fitting and approximation. In: Approximation theory XII: San Antonio, pp 210–228

  12. Laurent L (2016) GRENAT (Matlab/Octave Toolbox). https://bitbucket.org/luclaurent/grenat

  13. Box GEP, Wilson K (1951) On the experimental attainment of optimum conditions. J R Stat Soc B 13(1):1–45

  14. Simpson TW, Mistree F, Korte JJ, Mauery TM (1998) Comparison of response surface and kriging models for multidisciplinary design optimization. In: 7th AIAA/USAF/NASA/ISSMO symposium on multidisciplinary analysis and optimization, no AIAA-98-4755, American Institute of Aeronautics and Astronautics

  15. Giunta AA, Watson LT (1998) A comparison of approximation modeling techniques—polynomial versus interpolating models. In: 7th AIAA/USAF/NASA/ISSMO symposium on multidisciplinary analysis and optimization, no AIAA-98-4758, American Institute of Aeronautics and Astronautics

  16. Jin R, Chen W, Simpson T (2000) Comparative studies of metamodeling techniques under multiple modeling criteria. In: 8th symposium on multidisciplinary analysis and optimization, no AIAA-2000-4801, American Institute of Aeronautics and Astronautics

  17. Varadarajan S, Chen W, Pelka CJ (2000) Robust concept exploration of propulsion systems with enhanced model approximation capabilities. Eng Optim 32(3):309–334

  18. Queipo NV, Haftka RT, Shyy W, Goel T, Vaidyanathan R, Tucker PK (2005) Surrogate-based analysis and optimization. Prog Aerosp Sci 41(1):1–28

  19. Peter J, Marcelet M, Burguburu S, Pediroda V (2007) Comparison of surrogate models for the actual global optimization of a 2d turbomachinery flow. In: Proceedings of the 7th WSEAS international conference on simulation, modelling and optimization, pp 46–51. World Scientific and Engineering Academy and Society (WSEAS)

  20. Marcelet M (2008) Etude et mise en oeuvre d’une méthode d’optimisation de forme couplant simulation numérique en aérodynamique et en calcul de structure. PhD thesis, École Nationale Supérieure d’Arts et Métiers

  21. Forrester AIJ, Sóbester A, Keane AJ (2008) Engineering design via surrogate modelling: a practical guide, 1st edn. Wiley, Chichester

  22. Kim B-S, Lee Y-B, Choi D-H (2009) Comparison study on the accuracy of metamodeling technique for non-convex functions. J Mech Sci Technol 23:1175–1181

  23. Zhao D, Xue D (2010) A comparative study of metamodeling methods considering sample quality merits. Struct Multidiscip Optim 42(6):923–938

  24. McKay MD, Beckman RJ, Conover WJ (1979) Comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 21(2):239–245

  25. Santner TJ, Williams BJ, Notz WI (2003) The design and analysis of computer experiments. Springer, New York

  26. Fang KT, Li R, Sudjianto A (2005) Design and modeling for computer experiments. Chapman & Hall, Boca Raton

  27. Franco J (2008) Planification d'expériences numériques en phase exploratoire pour la simulation des phénomènes complexes. PhD thesis, École Nationale Supérieure des Mines de Saint-Étienne

  28. Schonlau M (1997) Computer experiments and global optimization. PhD thesis, University of Waterloo

  29. Sóbester A, Leary SJ, Keane AJ (2005) On the design of optimization strategies based on global response surface approximation models. J Glob Optim 33(1):31–59

  30. Alexandrov NM, Dennis JE, Lewis RM, Torczon V (1998) A trust-region framework for managing the use of approximation models in optimization. Struct Optim 15(1):16–23

  31. Sasena MJ (2002) Flexibility and efficiency enhancements for constrained global design optimization with kriging approximations. PhD thesis, University of Michigan

  32. Watson AG, Barnes RJ (1995) Infill sampling criteria to locate extremes. Math Geol 27(5):589–608

  33. Jones DR, Schonlau M, Welch WJ (1998) Efficient global optimization of expensive black-box functions. J Glob Optim 13(4):455–492

  34. Matheron G (1970) La théorie des variables régionalisées et ses applications. Les Cahiers du Centre de Morphologie Mathématique de Fontainebleau, Fascicule 5, École Nationale des Mines de Paris

  35. Wackernagel H (1995) Multivariate geostatistics: an introduction with applications. Springer, Berlin

  36. Isaaks EH, Srivastava RM (1989) An introduction to applied geostatistics. Oxford University Press, New York

  37. Hoef JM, Cressie NAC (1993) Multivariable spatial prediction. Math Geol 25(2):219–240

  38. Goovaerts P (1997) Geostatistics for natural resources evaluation. Oxford University Press, New York

  39. Morris MD, Mitchell TJ, Ylvisaker D (1993) Bayesian design and analysis of computer experiments: use of derivatives in surface prediction. Technometrics 35(3):243–255

  40. Koehler JR, Owen AB (1996) Computer experiments. Handb Stat 13:261–308

  41. Lewis RM (1998) Using sensitivity information in the construction of kriging models for design optimization. In: Proceedings of the 7th AIAA/USAF/NASA/ISSMO multidisciplinary analysis & optimization symposium, Saint Louis, Missouri

  42. Arnaud M, Emery X (2000) Estimation et interpolation spatiale. Hermes Science Publications, Paris

  43. Chung H-S, Alonso JJ (2002) Using gradients to construct cokriging approximation models for high-dimensional design optimization problems. In: 40th AIAA aerospace sciences meeting & exhibit, no AIAA-2002-0317, American Institute of Aeronautics and Astronautics

  44. Chung H-S, Alonso JJ (2002) Design of a low-boom supersonic business jet using cokriging approximation models. In: 9th AIAA/ISSMO symposium on multidisciplinary analysis and optimization, no AIAA-2002-5598, American Institute of Aeronautics and Astronautics

  45. Leary SJ, Bhaskar A, Keane AJ (2004) A derivative based surrogate model for approximating and optimizing the output of an expensive computer simulation. J Glob Optim 30(1):39–58

  46. Leary SJ, Bhaskar A, Keane AJ (2004) Global approximation and optimization using adjoint computational fluid dynamics codes. AIAA J 42(3):631–641

  47. Laurenceau J, Sagaut P (2008) Building efficient response surfaces of aerodynamic functions with kriging and cokriging. AIAA J 46(2):498–507

  48. Laurenceau J, Meaux M (2008) Comparison of gradient and response surface based optimization frameworks using adjoint method. In: 49th AIAA/ASME/ASCE/AHS/ASC structures, structural dynamics, and materials conference, 16th AIAA/ASME/AHS adaptive structures conference, 10th AIAA non-deterministic approaches conference, 9th AIAA Gossamer Spacecraft Forum, 4th AIAA multidisciplinary design optimization specialists conference, no AIAA-2008-1889, American Institute of Aeronautics and Astronautics

  49. Dwight R, Han Z-H (2009) Efficient uncertainty quantification using gradient-enhanced kriging. In: 50th AIAA/ASME/ASCE/AHS/ASC structures, structural dynamics, and materials conference, no AIAA-2009-2276, American Institute of Aeronautics and Astronautics

  50. Xuan Y, Xiang J, Zhang W, Zhang Y (2009) Gradient-based kriging approximate model and its application research to optimization design. Sci China E 52(4):1117–1124

  51. Lockwood BA, Mavriplis DJ (2010) Parameter sensitivity analysis for hypersonic viscous flow using a discrete adjoint approach. AIAA Paper 447

  52. March A, Willcox K, Wang Q (2010) Gradient-based multifidelity optimisation for aircraft design using Bayesian model calibration. Aeronaut J 115(1174):729–738

  53. Yamazaki W, Rumpfkeil M, Mavriplis D (2010) Design optimization utilizing gradient/hessian enhanced surrogate model. In: 28th AIAA applied aerodynamics conference, no AIAA-2010-4363, American Institute of Aeronautics and Astronautics

  54. Bompard M (2011) Modèles de substitution pour l'optimisation globale de forme en aérodynamique et méthode locale sans paramétrisation. PhD thesis, Université de Nice Sophia Antipolis

  55. Rumpfkeil MP, Yamazaki W, Mavriplis DJ (2011) A dynamic sampling method for kriging and cokriging surrogate models. In: 49th AIAA Aerospace Sciences Meeting including the New Horizons Forum and Aerospace Exposition, no AIAA-2011-883, American Institute of Aeronautics and Astronautics

  56. Laurent L, Boucard P-A, Soulier B (2012) Gradient-enhanced metamodels and multiparametric strategies for designing structural assemblies. In: Topping BHV (ed) Proceedings of the eleventh international conference on computational structures technology, 4–7 September, no Paper 230, Civil-Comp Press, Stirlingshire, UK

  57. Laurent L (2013) Stratégie multiparamétrique et métamodèles pour l’optimisation multiniveaux de structures. PhD thesis, École Normale Supérieure de Cachan, 61, avenue du Président Wilson, 94230 Cachan

  58. Laurent L, Boucard P-A, Soulier B (2013) Combining multiparametric strategy and gradient-based surrogate model for optimizing structure assemblies. In: ISSMO (ed)10th world congress on structural and multidisciplinary optimization, Orlando, Florida, USA, 19–24 May

  59. Laurent L (2014) Global optimisation on assembly problems using gradient-based surrogate model and multiparametric strategy. In: PhD Olympiad ECCOMAS, 11th World Congress on Computational Mechanics, Barcelona, Spain, 20–25 July 2014

  60. Han Z-H, Görtz S, Zimmermann R (2013) Improving variable-fidelity surrogate modeling via gradient-enhanced kriging and a generalized hybrid bridge function. Aerosp Sci Technol 25(1):177–189

  61. Zimmermann R (2013) On the maximum likelihood training of gradient-enhanced spatial Gaussian processes. SIAM J Sci Comput 35(6):A2554–A2574

  62. Ulaganathan S, Couckuyt I, Dhaene T, Degroote J, Laermans E (2016) Performance study of gradient-enhanced kriging. Eng Comput 32(1):15–34

  63. Ulaganathan S, Couckuyt I, Ferranti F, Laermans E, Dhaene T (2015) Performance study of multi-fidelity gradient enhanced kriging. Struct Multidiscip Optim 51(5):1017–1033

  64. Zongmin W (1992) Hermite-Birkhoff interpolation of scattered data by radial basis functions. Approx Theory Appl 8(2):1–10

  65. Kampolis IC, Karangelos EI, Giannakoglou KC (2004) Gradient-assisted radial basis function networks: theory and applications. Appl Math Model 28(2):197–209

  66. Giannakoglou KC, Papadimitriou DI, Kampolis IC (2006) Aerodynamic shape design using evolutionary algorithms and new gradient-assisted metamodels. Comput Methods Appl Mech Eng 195(44–47):6312–6329

  67. Ong Y-S, Lum K, Nair PB (2008) Hybrid evolutionary algorithm with Hermite radial basis function interpolants for computationally expensive adjoint solvers. Comput Optim Appl 39(1):97–119

  68. Lázaro M, Santamaría I, Pérez-Cruz F, Artés-Rodríguez A (2005) Support vector regression for the simultaneous learning of a multivariate function and its derivatives. Neurocomputing 69(1–3):42–61

  69. Jayadeva, Khemchandani R, Chandra S (2006) Regularized least squares twin SVR for the simultaneous learning of a function and its derivative. In: International joint conference on neural networks (IJCNN '06), 2006

  70. Bloch G, Lauer F, Colin G, Chamaillard Y (2008) Support vector regression from simulation data and few experimental samples. Inf Sci 178(20):3813–3827. Special issue on industrial applications of neural networks—10th engineering applications of neural networks 2007

  71. Jayadeva, Khemchandani R, Chandra S (2008) Regularized least squares support vector regression for the simultaneous learning of a function and its derivatives. Inf Sci 178(17):3402–3414

  72. Lauer F, Bloch G (2008) Incorporating prior knowledge in support vector regression. Mach Learn 70(1):89–118

  73. Khemchandani R, Karpatne A, Chandra S (2013) Twin support vector regression for the simultaneous learning of a function and its derivatives. Int J Mach Learn Cybern 4(1):51–63

  74. Renka RJ (1988) Multivariate interpolation of large sets of scattered data. ACM Trans Math Softw (TOMS) 14(2):139–148

  75. Lauridsen S, Vitali R, van Keulen F, Haftka RT, Madsen JI (2002) Response surface approximation using gradient information. In: Cheng et al (eds) Proceedings of the 4th world congress on structural and multidisciplinary optimization, Dalian, China, 4–8 June 2001, vol 4, p 5

  76. Kim C, Wang S, Choi KK (2005) Efficient response surface modeling by using moving least-squares method and sensitivity. AIAA J 43(11):2404–2411

  77. Breitkopf P, Naceur H, Rassineux A, Villon P (2005) Moving least squares response surface approximation: formulation and metal forming applications. Comput Struct 83(17–18):1411–1428. Advances in meshfree methods

  78. van Keulen F, Liu B, Haftka RT (2000) Noise and discontinuity issues in response surfaces based on functions and derivatives. In: 41st structures, structural dynamics, and materials conference and exhibit, no AIAA-00-1363, American Institute of Aeronautics and Astronautics

  79. Vervenne K, van Keulen F (2002) An alternative approach to response surface building using gradient information. In 43rd AIAA/ASME/ASCE/AHS/ASC structures, structural dynamics, and materials conference, no AIAA-2002-1584, American Institute of Aeronautics and Astronautics

  80. van Keulen F, Vervenne K (2004) Gradient-enhanced response surface building. Struct Multidiscip Optim 27(5):337–351

  81. Liu W (2003) Development of gradient-enhanced kriging approximations for multidisciplinary design optimization. PhD thesis, University of Notre Dame, Indiana

  82. Myers RH, Montgomery DC (1995) Response surface methodology: process and product optimization using designed experiments. Wiley, New York

  83. Mazja V (1985) Sobolev spaces. Springer, New York

  84. Runge C (1901) Über empirische Funktionen und die Interpolation zwischen äquidistanten Ordinaten. Zeitschrift für Mathematik und Physik 46:224–243

  85. Haftka RT (1993) Semi-analytical static nonlinear structural sensitivity analysis. AIAA J 31(7):1307–1312

  86. Rasmussen CE (2006) Gaussian processes for machine learning. Adaptive computation and machine learning. MIT Press, Cambridge

  87. Stander N (2001) The successive response surface method applied to sheet-metal forming. In: Proceedings of the first MIT conference on computational fluid and solid mechanics, pp 481–485, 12–15 June 2001

  88. Lancaster P, Salkauskas K (1981) Surfaces generated by moving least squares methods. Math Comput 37(155):141–158

  89. Zhou L, Zheng WX (2006) Moving least square Ritz method for vibration analysis of plates. J Sound Vib 290(3–5):968–990

  90. Häussler-Combe U, Korn C (1998) An adaptive approach with the element-free-Galerkin method. Comput Methods Appl Mech Eng 162(1–4):203–222

  91. Shepard D (1968) A two-dimensional interpolation function for irregularly-spaced data. In: Proceedings of the 1968 23rd ACM national conference, pp 517–524. ACM

  92. Mercer J (1909) Functions of positive and negative type, and their connection with the theory of integral equations. Philos Trans R Soc Lond A 209(441–458):415–446

  93. Genton MG (2001) Classes of kernels for machine learning: a statistics perspective. J Mach Learn Res 2:299–312

  94. Laurent L, Boucard P-A, Soulier B (2013) Generation of a cokriging metamodel using a multiparametric strategy. Comput Mech 51:151–169

  95. Laurent L, Boucard P-A, Soulier B (2013) A dedicated multiparametric strategy for the fast construction of a cokriging metamodel. Comput Struct 124:61–73. Special Issue: KRETA

  96. Stein ML (1999) Interpolation of spatial data: some theory for kriging. Springer, New York

  97. Matérn B (1960) Spatial variation (Lecture Notes in Statistics, vol 36). Springer, Berlin

  98. Abramowitz M, Stegun I (1964) Handbook of mathematical functions with formulas, graphs, and mathematical tables. In: Applied mathematics series, vol 55, no 1972. U.S. Government Printing Office

  99. Lockwood BA, Anitescu M (2012) Gradient-enhanced universal kriging for uncertainty propagation. Nucl Sci Eng 170:168–195

  100. Hardy RL (1971) Multiquadric equations of topography and other irregular surfaces. J Geophys Res 76(8):1905–1915

  101. Powell MJ (1981) Approximation theory and methods. Cambridge University Press, Cambridge

  102. Broomhead D, Lowe D (1988) Multivariable functional interpolation and adaptive networks. Complex Syst 2:321–355

  103. Beachkofski B, Grandhi R (2002) Improved distributed hypercube sampling. In: 43rd AIAA/ASME/ASCE/AHS/ASC structures, structural dynamics, and materials conference, no AIAA 2002–1274

  104. Schaback R (2007) A practical guide to radial basis functions. Book for teaching (Preprint)

  105. Stone M (1974) Cross-validatory choice and assessment of statistical predictions. J R Stat Soc B 36:111–147

  106. Geisser S (1975) The predictive sample reuse method with applications. J Am Stat Assoc 70(350):320–328

  107. Bompard M, Peter J, Desideri J et al (2010) Surrogate models based on function and derivative values for aerodynamic global optimization. In: Proceedings of the V European conference on computational fluid dynamics ECCOMAS CFD 2010

  108. Rippa S (1999) An algorithm for selecting a good value for the parameter c in radial basis function interpolation. Adv Comput Math 11:193–210

  109. Soulier B, Courrier N, Laurent L, Boucard P-A (2015) Métamodèles à gradients et multiniveaux de fidélité pour l'optimisation d'assemblages. In: 12ème Colloque National en Calcul des Structures, Giens, France, 12-22 mai, CSMA

  110. Ulaganathan S, Couckuyt I, Dhaene T, Degroote J, Laermans E (2016) High dimensional kriging metamodelling utilising gradient information. Appl Math Model 40:5256–5270

  111. Chauvet P (1999) Aide-mémoire de la géostatistique linéaire. Cahiers de Géostatistique, Fascicule 2, École Nationale Supérieure des Mines de Paris, Centre de Géostatistique, Fontainebleau

  112. Mardia K, Marshall R (1984) Maximum likelihood estimation of models for residual covariance in spatial regression. Biometrika 71(1):135

  113. Warnes J, Ripley B (1987) Problems with likelihood estimation of covariance functions of spatial Gaussian processes. Biometrika 74(3):640

  114. Toal DJ, Bressloff N, Keane A, Holden C (2011) The development of a hybridized particle swarm for kriging hyperparameter tuning. Eng Optim 43:675–699

  115. Toal DJJ, Forrester AIJ, Bressloff NW, Keane AJ, Holden C (2009) An adjoint for likelihood maximization. Proc R Soc Lond A 465(2111):3267–3287

  116. Laurent L (2013) Multilevel optimisation of structures using a multiparametric strategy and metamodels. PhD thesis, École Normale Supérieure de Cachan, 61, avenue du Président Wilson, 94230 Cachan

  117. Vapnik VN (1995) The nature of statistical learning theory. Springer, New York

  118. Vapnik VN, Chervonenkis AY (1974) Theory of pattern recognition. Nauka, Moscow (in Russian)

  119. Smola AJ, Murata N, Schölkopf B, Müller K-R (1998) Asymptotically optimal choice of ε-loss for support vector machines. In: Niklasson LF (ed) Proceedings of the 8th international conference on artificial neural networks (ICANN 98), vol 1, Skövde, Sweden, 2–4 September 1998, pp 105–110. Springer

  120. Smola AJ, Schölkopf B (2004) A tutorial on support vector regression. Stat Comput 14(3):199–222

  121. Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines: and other kernel-based learning methods. Cambridge University Press, New York

  122. Platt J (1998) Sequential minimal optimization: a fast algorithm for training support vector machines. Technical Report MSR-TR-98-14, Microsoft Research

  123. Schölkopf B, Bartlett PL, Smola AJ, Williamson R (1999) Shrinking the tube: a new support vector regression algorithm. In: Advances in neural information processing systems, vol 11, pp 330–336. MIT Press, Cambridge, MA

  124. Schölkopf B, Smola AJ, Williamson RC, Bartlett PL (2000) New support vector algorithms. Neural Comput 12:1207–1245

  125. Chang C-C, Lin C-J (2002) Training v-support vector regression: theory and algorithms. Neural Comput 14:1959–1977

  126. Cherkassky V, Ma Y (2002) Selection of meta-parameters for support vector regression. In: Artificial neural networks—ICANN 2002, international conference, Madrid, Spain, August 28–30, 2002, proceedings, pp 687–693. Springer, Berlin, Heidelberg

  127. Vapnik VN (1998) Statistical learning theory. Wiley, New York

  128. Vapnik VN, Chapelle O (2000) Bounds on error expectation for support vector machines. Neural Comput 12:2013–2036

  129. Chapelle O, Vapnik VN, Bousquet O, Mukherjee S (2002) Choosing multiple parameters for support vector machines. Mach Learn 46(1–3):131–159

  130. Chang M-W, Lin C-J (2005) Leave-one-out bounds for support vector regression model selection. Neural Comput 17:1188–1222

  131. Laurent L, Boucard P-A, Soulier B (2011) Fast multilevel optimization using a multiparametric strategy and a cokriging metamodel. In: Tsompanakis Y, Tolpping BHV (eds) Proceedings of the second international conference on soft computing technology in civil, structural and environmental engineering, 6–9 September, no Paper 50, Civil-Comp Press, Stirlingshire, UK

  132. Laurent L, Soulier B, Le Riche R, Boucard P-A (2016) On the use of gradient-enhanced metamodels for global approximation and global optimization. In: VII European congress on computational methods in applied sciences and engineering, Hersonissos, Crete, Greece, June 5-10

  133. Laurent L (2016) MultiDOE (Matlab/Octave Toolbox). https://bitbucket.org/luclaurent/multidoe

  134. Couckuyt I, Dhaene T, Demeester P (2014) ooDACE toolbox: a flexible object-oriented kriging implementation. J Mach Learn Res 15(1):3183–3186

  135. Ulaganathan S, Couckuyt I, Deschrijver D, Laermans E, Dhaene T (2015) A matlab toolbox for kriging metamodelling. Procedia Comput Sci 51:2708–2713

  136. Forrester AIJ, Sóbester A, Keane AJ (2006) Optimization with missing data. Proc R Soc A 462(2067):935–945

  137. Fritz J, Neuweiler I, Nowak W (2009) Application of FFT-based algorithms for large-scale universal kriging problems. Math Geosci 41(5):509–533

  138. Hensman J, Durrande N, Solin A (2016) Variational Fourier features for Gaussian processes. ArXiv e-prints

  139. Bouhlel MA, Bartoli N, Otsmane A, Morlier J (2016) Improving kriging surrogates of high-dimensional design models by partial least squares dimension reduction. Struct Multidiscip Optim 53(5):935–952

  140. Bouhlel MA, Bartoli N, Otsmane A, Morlier J (2016) An improved approach for estimating the hyperparameters of the kriging model for high-dimensional problems through the partial least squares method. Math Prob Eng 2016:11

Author information

Corresponding author

Correspondence to Luc Laurent.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

About this article

Cite this article

Laurent, L., Le Riche, R., Soulier, B. et al. An Overview of Gradient-Enhanced Metamodels with Applications. Arch Computat Methods Eng 26, 61–106 (2019). https://doi.org/10.1007/s11831-017-9226-3
