Bayesian Optimization Using Sequential Monte Carlo

  • Romain Benassi
  • Julien Bect
  • Emmanuel Vazquez
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7219)

Abstract

We consider the problem of optimizing a real-valued continuous function f using a Bayesian approach, where the evaluations of f are chosen sequentially by combining prior information about f, described by a random process model, with past evaluation results. The main difficulty with this approach is computing the posterior distributions of the quantities of interest that are used to choose the evaluation points. In this article, we adopt a Sequential Monte Carlo (SMC) approach.
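The sequential strategy described in the abstract, fitting a Gaussian process to past evaluations and then picking the next evaluation point by maximizing a sampling criterion such as the expected improvement, can be illustrated with a minimal sketch. This is a generic illustration, not the authors' algorithm: the kernel, its hyperparameters (`ell`, `sigma2`) and the test function are assumptions, and the hyperparameters are held fixed rather than integrated out via SMC as in the paper.

```python
import numpy as np
from scipy.stats import norm

def gp_posterior(X, y, Xstar, ell=0.3, sigma2=1.0, noise=1e-8):
    """Posterior mean and variance of a zero-mean GP with a squared-exponential kernel."""
    def k(A, B):
        d = A[:, None] - B[None, :]
        return sigma2 * np.exp(-0.5 * (d / ell) ** 2)
    L = np.linalg.cholesky(k(X, X) + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(X, Xstar)
    mu = Ks.T @ alpha                       # posterior mean at the candidate points
    v = np.linalg.solve(L, Ks)
    var = sigma2 - np.sum(v ** 2, axis=0)   # posterior variance (k(x,x) = sigma2)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, y_best):
    """EI criterion for minimization: E[max(y_best - Y, 0)] under the GP posterior."""
    s = np.sqrt(var)
    z = (y_best - mu) / s
    return s * (z * norm.cdf(z) + norm.pdf(z))

# One iteration: choose the next evaluation of f(x) = (x - 0.6)**2 on [0, 1].
f = lambda x: (x - 0.6) ** 2
X = np.array([0.0, 0.5, 1.0])               # past evaluation points
y = f(X)                                    # past evaluation results
grid = np.linspace(0.0, 1.0, 201)           # candidate points
mu, var = gp_posterior(X, y, grid)
ei = expected_improvement(mu, var, y.min())
x_next = grid[np.argmax(ei)]                # next point: maximizer of EI
```

In the fully Bayesian setting treated in the article, the GP hyperparameters are not fixed: an SMC method maintains a weighted particle approximation of their posterior, and the sampling criterion is averaged over the particles before being maximized.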

Keywords

Sequential Monte Carlo · Expected Improvement · Gaussian Process Model · Reference Algorithm · Bayesian Optimization


Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Romain Benassi¹
  • Julien Bect¹
  • Emmanuel Vazquez¹
  1. SUPELEC, France
