
Bayesian Uncertainty Propagation Using Gaussian Processes

  • Ilias Bilionis
  • Nicholas Zabaras

Abstract

Classic non-intrusive uncertainty propagation techniques typically require a large number of model evaluations to yield convergent statistics. In practice, however, the computational cost of the underlying computer codes significantly limits the number of evaluations one can actually afford. In such situations, the estimates produced by classic approaches cannot be trusted, since the limited number of observations induces additional epistemic uncertainty. The goal of this chapter is to highlight how the Bayesian formalism can quantify this epistemic uncertainty and provide robust predictive intervals for the statistics of interest with as few simulations as are available. It is shown how the Bayesian formalism can be realized by employing the concept of a Gaussian process (GP). In addition, several practical aspects that depend on the nature of the underlying response surface, such as the treatment of spatiotemporal variation and multi-output responses, are discussed. The practicality of the approach is demonstrated by propagating uncertainty through a dynamical system and an elliptic partial differential equation.
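The following is a minimal, hypothetical sketch (not taken from the chapter) of the idea the abstract describes: fit a GP to a handful of simulator runs, then propagate input uncertainty through samples of the GP posterior rather than through its mean alone, so that the scarcity of runs shows up as an epistemic interval around the estimated statistic. It assumes scikit-learn's GaussianProcessRegressor, and the toy model function stands in for an expensive simulator.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def model(x):
        # Toy stand-in for an expensive simulator (hypothetical).
        return np.sin(3.0 * x) + 0.5 * x ** 2

    rng = np.random.default_rng(0)

    # Only a handful of simulations are affordable.
    X_train = rng.uniform(-1.0, 1.0, size=(8, 1))
    y_train = model(X_train).ravel()

    gp = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF(length_scale=0.5),
        normalize_y=True,
    )
    gp.fit(X_train, y_train)

    # Input uncertainty: x ~ U(-1, 1). Propagate it through samples of the
    # GP posterior (not just the posterior mean), so that the limited
    # number of runs appears as epistemic spread in the statistic.
    X_mc = rng.uniform(-1.0, 1.0, size=(2000, 1))
    f_samples = gp.sample_y(X_mc, n_samples=100, random_state=1)  # (2000, 100)

    # Each posterior sample of the response surface yields one candidate
    # value of the statistic E[f(x)].
    mean_estimates = f_samples.mean(axis=0)
    lo, hi = np.percentile(mean_estimates, [2.5, 97.5])
    print("E[f(x)] estimate: %.3f, 95%% epistemic interval: [%.3f, %.3f]"
          % (mean_estimates.mean(), lo, hi))

A plain Monte Carlo estimate would report only a single number here; the width of the interval above contracts as more simulations are added, which is the behavior the chapter formalizes.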

Keywords

Epistemic uncertainty · Expensive computer code · Expensive computer simulations · Gaussian process · Uncertainty propagation


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. School of Mechanical Engineering, Purdue University, West Lafayette, USA
  2. Warwick Centre for Predictive Modelling, University of Warwick, Coventry, UK
