
Journal of Global Optimization, Volume 43, Issue 2–3, pp 373–389

Global optimization of expensive-to-evaluate functions: an empirical comparison of two sampling criteria

  • Julien Villemonteix
  • Emmanuel Vazquez
  • Maryan Sidorkiewicz
  • Eric Walter

Abstract

In many global optimization problems motivated by engineering applications, the number of function evaluations is severely limited by time or cost. To ensure that each of these evaluations contributes usefully to the localization of good candidates for the role of global minimizer, a stochastic model of the function can be built to conduct a sequential choice of evaluation points. Based on Gaussian processes and Kriging, the authors have recently introduced the informational approach to global optimization (IAGO), which provides a one-step optimal choice of evaluation points in terms of reduction of uncertainty on the location of the minimizers. To do so, the probability density of the minimizers is approximated using conditional simulations of the Gaussian process model behind Kriging. In this paper, an empirical comparison is presented between the underlying sampling criterion, called conditional minimizer entropy (CME), and the standard expected improvement (EI) sampling criterion. Classical test functions are used, as well as sample paths of the Gaussian model and an industrial application. The results demonstrate the benefit of the CME sampling criterion in terms of evaluation savings.
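To make the two criteria concrete, here is a minimal, self-contained Python sketch of the ingredients the abstract describes, on a one-dimensional toy problem. It is not the authors' implementation: the kernel, its hyperparameters, and the toy function are illustrative assumptions. It computes the closed-form EI from the Kriging mean and variance, and estimates the entropy of the minimizer distribution from conditional simulations of the Gaussian process, in the spirit of IAGO.

```python
# Minimal sketch (assumed toy setup, not the authors' code): Kriging posterior,
# closed-form expected improvement (EI), and a Monte Carlo estimate of the
# entropy of the minimizer distribution via conditional simulations.
import numpy as np
from scipy.stats import norm

def sq_exp_kernel(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-8):
    """Kriging (GP) posterior mean and covariance on a test grid."""
    K = sq_exp_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = sq_exp_kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha
    cov = sq_exp_kernel(x_test, x_test) - v.T @ v
    return mean, cov

def expected_improvement(mean, var, f_min):
    """Closed-form EI for minimization, from predictive mean and variance."""
    s = np.sqrt(np.maximum(var, 1e-12))
    u = (f_min - mean) / s
    return s * (u * norm.cdf(u) + norm.pdf(u))

def minimizer_entropy(mean, cov, n_sim=2000, seed=0):
    """Entropy (nats) of the empirical distribution of the grid point at
    which each conditional simulation (posterior sample path) is minimal."""
    rng = np.random.default_rng(seed)
    paths = rng.multivariate_normal(mean, cov + 1e-8 * np.eye(len(mean)),
                                    size=n_sim)
    counts = np.bincount(paths.argmin(axis=1), minlength=len(mean))
    p = counts[counts > 0] / n_sim
    return -np.sum(p * np.log(p))

# Toy run: where would EI evaluate next, and how uncertain is the minimizer?
f = lambda x: np.sin(3.0 * x) + 0.5 * x   # stand-in for an expensive function
x_train = np.array([0.1, 0.5, 0.9])
x_grid = np.linspace(0.0, 1.0, 200)
mean, cov = gp_posterior(x_train, f(x_train), x_grid)
ei = expected_improvement(mean, np.diag(cov), f(x_train).min())
print("EI suggests x =", x_grid[ei.argmax()])
print("Minimizer-distribution entropy:", minimizer_entropy(mean, cov))
```

The full CME criterion goes one step further than this sketch: for each candidate point, it averages the minimizer entropy over hypothetical outcomes of an evaluation there and picks the candidate that most reduces it. That extra conditioning step, omitted here for brevity, is what makes CME more expensive to compute than EI.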

Keywords

Expected improvement · Global optimization · Kriging



Copyright information

© Springer Science+Business Media, LLC. 2008

Authors and Affiliations

  • Julien Villemonteix (1)
  • Emmanuel Vazquez (2)
  • Maryan Sidorkiewicz (1)
  • Eric Walter (3)

  1. Renault S.A., Energy Systems Department, Guyancourt, France
  2. Supelec, Gif-sur-Yvette, France
  3. Laboratoire des Signaux et Systèmes, CNRS-Supelec-Univ Paris-Sud, Gif-sur-Yvette, France
