
Conditional optimization of a noisy function using a kriging metamodel

  • Diariétou Sambakhé
  • Lauriane Rouan
  • Jean-Noël Bacro
  • Eric Gozé

Abstract

The efficient global optimization (EGO) method is popular for the global optimization of computationally expensive black-box functions. Extensions exist either for the optimization of noisy functions or for the conditional optimization of deterministic functions, i.e. the search for the values of a subset of parameters that optimize the function conditional on the values taken by another subset, which are held fixed. A metaphor for conditional optimization is the search for a crest line. No method has yet been developed for the conditional optimization of noisy functions: this is what we propose in this article. Tests of this new method on test functions showed that, when the noise on the function is high, the PEQI criterion that we propose outperforms the PEI criterion usually implemented in such a situation.
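
As an informal illustration only (this is not the method or code developed in the article), the sketch below shows the building blocks the abstract refers to: a kriging (Gaussian-process) metamodel fitted to noisy evaluations, the classical expected-improvement acquisition, and a quantile-based variant in the spirit of quantile criteria for noisy optimization. The kernel choice, the fixed hyperparameters and the function names (kriging_predict, expected_improvement, quantile_improvement) are assumptions made for this sketch.

    # Minimal sketch, in Python, of kriging-based optimization of a noisy function.
    # Hyperparameters are fixed for brevity; a real EGO-type loop would estimate
    # them, e.g. by maximum likelihood, and would iterate fit / acquire / evaluate.
    import numpy as np
    from scipy.stats import norm

    def kriging_predict(X, y, Xnew, length_scale=0.3, sigma2=1.0, noise2=0.1):
        """Posterior mean and variance of a zero-mean GP with a squared-exponential kernel."""
        def k(A, B):
            d = A[:, None, :] - B[None, :, :]
            return sigma2 * np.exp(-0.5 * np.sum(d**2, axis=-1) / length_scale**2)
        K = k(X, X) + noise2 * np.eye(len(X))   # noisy observations: nugget on the diagonal
        Ks = k(Xnew, X)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        mu = Ks @ alpha
        v = np.linalg.solve(L, Ks.T)
        var = np.maximum(sigma2 - np.sum(v**2, axis=0), 1e-12)
        return mu, var

    def expected_improvement(mu, var, y_best):
        """Classical EI for minimization, relative to the best observed value."""
        s = np.sqrt(var)
        z = (y_best - mu) / s
        return (y_best - mu) * norm.cdf(z) + s * norm.pdf(z)

    def quantile_improvement(mu, var, q_best, beta=0.9):
        """Improvement of the beta-quantile of the predictor: a noise-aware target
        (the best raw observation is unreliable under noise, so a predictive
        quantile is used as the reference instead)."""
        q = mu + norm.ppf(beta) * np.sqrt(var)
        return np.maximum(q_best - q, 0.0)

    # Toy usage: noisy 1-D function, next evaluation point chosen by maximizing EI.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(8, 1))
    y = np.sin(6.0 * X[:, 0]) + 0.3 * rng.standard_normal(8)   # noisy observations
    grid = np.linspace(0.0, 1.0, 200)[:, None]
    mu, var = kriging_predict(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, var, y.min()))]

In the conditional (crest-line) setting of the article, the improvement would be defined with respect to the profile optimum along the fixed subset of parameters rather than a single global optimum; the sketch above only illustrates the unconditional building blocks.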

Keywords

Crest line · Gaussian process · Sampling criterion · Sequential design · Noisy function

Acknowledgements

We are very grateful to the West Africa Agricultural Productivity Program (WAAPP) that funded this research as part of a Ph.D. thesis grant.

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Centre d’étude régional pour l’amélioration de l’adaptation à la sécheresse, Thiès Escale, Thiès, Senegal
  2. CIRAD, UMR AGAP, Montpellier, France
  3. AGAP, Univ Montpellier, CIRAD, INRA, Montpellier SupAgro, Montpellier, France
  4. IMAG, Univ Montpellier, CNRS, Montpellier, France
  5. CIRAD, UPR AIDA, Montpellier, France
  6. AIDA, Univ Montpellier, CIRAD, Montpellier, France