Journal of Global Optimization, Volume 56, Issue 4, pp 1773–1790

Examples of inconsistency in optimization by expected improvement


Abstract

We consider one-dimensional Expected Improvement (EI) optimization based on Gaussian processes whose spectral densities converge to zero faster than exponentially. We give examples of problems for which the optimization trajectory is not dense in the design space. In particular, we prove that for Gaussian kernels there exist smooth objective functions for which the optimization does not converge to the optimum.
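To make the setting concrete, the following is a minimal sketch (not the authors' code) of the algorithm the abstract studies: one-dimensional EI optimization under a zero-mean Gaussian-process prior with a Gaussian (squared-exponential) covariance function. For minimization, the acquisition is EI(x) = (y* − μ(x)) Φ(z) + σ(x) φ(z) with z = (y* − μ(x)) / σ(x), where y* is the best observed value. The objective function, length scale, design interval, and grid resolution below are illustrative assumptions only.

    # Minimal sketch of 1D Expected Improvement with a Gaussian kernel.
    # Illustrative assumptions: objective f, length scale ell, interval [0, 2].
    import numpy as np
    from scipy.stats import norm

    def gauss_kernel(a, b, ell=0.3):
        # Gaussian covariance k(x, x') = exp(-(x - x')^2 / (2 ell^2))
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

    def gp_posterior(x_train, y_train, x_test, ell=0.3, jitter=1e-10):
        # Standard noise-free GP regression; jitter keeps K positive definite
        K = gauss_kernel(x_train, x_train, ell) + jitter * np.eye(len(x_train))
        Ks = gauss_kernel(x_train, x_test, ell)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        var = 1.0 - np.sum(v ** 2, axis=0)   # k(x, x) = 1 for this kernel
        return mean, np.sqrt(np.maximum(var, 0.0))

    def expected_improvement(mean, sd, y_best):
        # EI(x) = E[max(y_best - Y(x), 0)] for minimization
        with np.errstate(divide="ignore", invalid="ignore"):
            z = (y_best - mean) / sd
            ei = (y_best - mean) * norm.cdf(z) + sd * norm.pdf(z)
        ei[sd <= 0.0] = 0.0                  # no improvement at observed points
        return ei

    f = lambda x: np.sin(3 * x) + 0.5 * x    # illustrative smooth objective
    grid = np.linspace(0.0, 2.0, 2001)       # design space [0, 2]
    x_obs = np.array([0.2, 1.8])             # initial design
    y_obs = f(x_obs)

    for _ in range(15):
        mean, sd = gp_posterior(x_obs, y_obs, grid)
        ei = expected_improvement(mean, sd, y_obs.min())
        x_next = grid[np.argmax(ei)]         # maximize the acquisition
        x_obs = np.append(x_obs, x_next)
        y_obs = np.append(y_obs, f(x_next))

    print("best x:", x_obs[np.argmin(y_obs)], "best f:", y_obs.min())

On this benign objective the loop typically concentrates evaluations near the minimizer; the paper's contribution is to exhibit smooth objectives for which, under such fast-decaying spectral densities, the sequence of EI evaluation points fails to be dense and misses the optimum.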

Keywords

Bayesian optimization · Expected improvement · Gaussian covariance function · Analytic kernel



Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Datadvance LLC, Moscow, Russia
  2. Institute for Information Transmission Problems, Moscow, Russia
