
Numerical Issues in Maximum Likelihood Parameter Estimation for Gaussian Process Interpolation

  • Conference paper
  • First Online:
Machine Learning, Optimization, and Data Science (LOD 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 13164)

Abstract

This article investigates the origin of numerical issues in maximum likelihood parameter estimation for Gaussian process (GP) interpolation and examines simple but effective strategies for improving commonly used open-source software implementations. Although the problem itself is basic, a host of studies, particularly in the Bayesian optimization literature, rely on off-the-shelf GP implementations; for the conclusions of these studies to be reliable and reproducible, robust GP implementations are critical.
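The numerical issue at the heart of the abstract can be illustrated with a minimal sketch (not taken from the paper): maximum likelihood estimation requires repeatedly evaluating the GP log marginal likelihood, which involves factorizing the covariance matrix K. When observation points are close together, K becomes ill-conditioned and the Cholesky factorization can fail or lose accuracy; a common remedy, discussed in the GP literature, is to add a small "jitter" (nugget) to the diagonal. The function names and the squared-exponential kernel below are illustrative choices, not the paper's implementation.

```python
import numpy as np

def gauss_kernel(x1, x2, lengthscale, variance):
    # Squared-exponential covariance between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def neg_log_likelihood(x, y, lengthscale, variance, jitter=1e-8):
    # Negative log marginal likelihood of a zero-mean GP.
    # The jitter term regularizes the diagonal so that the Cholesky
    # factorization succeeds even when K is nearly singular.
    n = len(x)
    K = gauss_kernel(x, x, lengthscale, variance) + jitter * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha
            + np.sum(np.log(np.diag(L)))
            + 0.5 * n * np.log(2 * np.pi))

# Closely spaced points make K severely ill-conditioned; the jitter
# keeps the likelihood evaluation numerically well defined.
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2 * np.pi * x)
nll = neg_log_likelihood(x, y, lengthscale=0.5, variance=1.0)
```

An optimizer such as L-BFGS-B would then minimize this function over (lengthscale, variance), typically on a log scale; the choice and size of the jitter directly affect where that optimization lands, which is the kind of sensitivity the paper studies.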


Notes

  1. Code available at https://github.com/saferGPMLE.


Author information

Correspondence to Emmanuel Vazquez.


Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Basak, S., Petit, S., Bect, J., Vazquez, E. (2022). Numerical Issues in Maximum Likelihood Parameter Estimation for Gaussian Process Interpolation. In: Nicosia, G., et al. (eds.) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol. 13164. Springer, Cham. https://doi.org/10.1007/978-3-030-95470-3_9


  • DOI: https://doi.org/10.1007/978-3-030-95470-3_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95469-7

  • Online ISBN: 978-3-030-95470-3

  • eBook Packages: Computer Science (R0)
