Structural and Multidisciplinary Optimization, Volume 48, Issue 3, pp 607–626

A benchmark of kriging-based infill criteria for noisy optimization

Research Paper

Abstract

The responses of many real-world problems can only be evaluated subject to noise. To enable efficient optimization of such problems, intelligent strategies that successfully cope with noisy evaluations are required. This article provides a comprehensive review of existing kriging-based methods for the optimization of noisy functions. In total, ten methods for choosing the sequential samples are described using a unified formalism. They are compared on analytical benchmark problems for which the usual assumption of homoscedastic Gaussian noise made in the underlying models is met. Different problem configurations and setups are considered, varying the noise level, the budget (maximum number of observations), the initial sample size, and the covariance function. It is found that the choices of the initial sample size and the covariance function are not critical, whereas the choice of the infill criterion can result in significant differences in performance. In particular, the three most intuitive criteria turn out to be poor alternatives. Although no criterion is consistently more efficient than all others, two specialized methods appear more robust on average.
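To make the setting concrete, the following sketch illustrates a single kriging-based infill loop under homoscedastic Gaussian noise. It is not the benchmark code used in the article; the 1-D test function, the Matern covariance, the treatment of the noise variance as a known nugget, and the plug-in expected-improvement criterion are illustrative assumptions made for this sketch only.

    # Minimal sketch (Python / scikit-learn) of a kriging-based infill loop
    # on a noisy function. Assumptions, not taken from the article: the test
    # function, the Matern kernel, and plug-in expected improvement (EI).
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(0)
    noise_sd = 0.1  # homoscedastic Gaussian noise level (assumed known here)

    def noisy_objective(x):
        """1-D test function observed with additive Gaussian noise."""
        return np.sin(3.0 * x) + 0.5 * x**2 + rng.normal(0.0, noise_sd, size=x.shape)

    # Initial design: a small set of uniformly spread observations.
    X = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)
    y = noisy_objective(X[:, 0])

    # Kriging model: Matern covariance, known noise variance added as a nugget (alpha).
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=noise_sd**2)

    def expected_improvement(x_cand, model, y_best):
        """Plug-in EI for minimization, a simple 'intuitive' infill criterion."""
        mu, sd = model.predict(x_cand, return_std=True)
        sd = np.maximum(sd, 1e-12)
        z = (y_best - mu) / sd
        return (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

    candidates = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)
    budget = 20  # total number of noisy evaluations
    while len(y) < budget:
        gp.fit(X, y)
        # Under noise, take the current best as the minimal kriging mean at the
        # observed points rather than the minimal (noisy) observation.
        y_best = gp.predict(X).min()
        x_next = candidates[np.argmax(expected_improvement(candidates, gp, y_best))]
        X = np.vstack([X, x_next.reshape(1, -1)])
        y = np.append(y, noisy_objective(x_next))

    gp.fit(X, y)
    print("estimated minimizer:", candidates[np.argmin(gp.predict(candidates))].item())

The criteria compared in the article differ precisely in how this infill step accounts for the noise; the plug-in EI above is only meant as a stand-in for the kind of simple, intuitive criterion to which the abstract's negative finding applies.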

Keywords

Metamodeling · Noise · EGO

Acknowledgments

The contributions of Tobias Wagner to this paper are based on investigations of the project D5 of the Collaborative Research Center SFB/TR TRR 30, which is kindly supported by the Deutsche Forschungsgemeinschaft (DFG).

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Victor Picheny (1)
  • Tobias Wagner (2)
  • David Ginsbourger (3)

  1. INRA, French National Institute for Agricultural Research, Castanet-Tolosan, France
  2. Institute of Machining Technology (ISF), Technische Universität Dortmund, Dortmund, Germany
  3. Bern University, Bern, Switzerland
