Computational Geosciences, Volume 17, Issue 4, pp 705–721

Reduced order models for many-query subsurface flow applications

  • George Shu Heng Pau
  • Yingqi Zhang
  • Stefan Finsterle
Original Paper

Abstract

Inverse modeling involves repeated evaluations of forward models, which can be computationally prohibitive for large numerical models. To reduce the overall computational burden of these simulations, we study the use of reduced order models (ROMs) as numerical surrogates. These ROMs typically use solutions of the high-fidelity model at selected sample points in the parameter space to construct an approximate solution at any other point in that space. This paper examines an input–output relational approach based on Gaussian process regression (GPR). We show that these ROMs are more accurate than linear lookup tables constructed from the same number of high-fidelity simulations. We describe an adaptive sampling procedure that automatically selects optimal sample points, and we demonstrate the application of GPR to both a smooth response surface and a response surface with abrupt changes. We also describe how GPR can be used to construct ROMs for models with heterogeneous material properties. Finally, we demonstrate how using a GPR-based ROM in two many-query applications, uncertainty quantification and global sensitivity analysis, significantly reduces the total computational effort.
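
As a rough illustration of the workflow summarized above, the sketch below fits a Gaussian process surrogate to a handful of forward-model evaluations, enriches the training set greedily where the predictive variance is largest (one simple variant of adaptive sampling, not necessarily the selection criterion used in the paper), and then queries the cheap surrogate in a Monte Carlo loop. It is a minimal, self-contained Python/NumPy example: the one-dimensional forward_model, the fixed kernel settings, and the sampling budget are placeholder assumptions standing in for an expensive subsurface flow simulation.

    import numpy as np

    def sq_exp_kernel(A, B, length_scale=0.3, signal_var=1.0):
        # Squared-exponential covariance between the rows of A and B.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

    def gp_fit_predict(X_train, y_train, X_test, noise_var=1e-6):
        # GP posterior mean and variance at X_test, conditioned on the training data.
        K = sq_exp_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
        Ks = sq_exp_kernel(X_train, X_test)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
        mean = Ks.T @ alpha
        v = np.linalg.solve(L, Ks)
        # k(x, x) = signal_var = 1.0 for this squared-exponential kernel.
        var = 1.0 - (v ** 2).sum(axis=0)
        return mean, var

    def forward_model(x):
        # Stand-in for an expensive high-fidelity simulation; here a cheap
        # analytic response over a one-dimensional parameter.
        return np.sin(6.0 * x[:, 0]) + 0.5 * x[:, 0] ** 2

    rng = np.random.default_rng(0)

    # Small initial design and a dense candidate pool over the parameter range [0, 1].
    X_train = rng.uniform(0.0, 1.0, size=(5, 1))
    y_train = forward_model(X_train)
    X_cand = np.linspace(0.0, 1.0, 200)[:, None]

    # Greedy adaptive sampling: repeatedly run the high-fidelity model at the
    # candidate point where the surrogate's predictive variance is largest.
    for _ in range(10):
        _, var = gp_fit_predict(X_train, y_train, X_cand)
        i = int(np.argmax(var))
        X_train = np.vstack([X_train, X_cand[i:i + 1]])
        y_train = np.append(y_train, forward_model(X_cand[i:i + 1]))

    # Many-query use: Monte Carlo uncertainty propagation through the surrogate.
    mc_inputs = rng.uniform(0.0, 1.0, size=(10000, 1))
    mc_mean, _ = gp_fit_predict(X_train, y_train, mc_inputs)
    print("Surrogate-based Monte Carlo estimate of the mean output:", mc_mean.mean())

In practice the kernel hyperparameters would be estimated (for example, by maximizing the marginal likelihood) rather than fixed, and the same fitted surrogate could be reused for global sensitivity analysis.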

Keywords

Surrogate models · Gaussian process regression · Hydrogeology · Uncertainty quantification


Copyright information

© Springer Science+Business Media Dordrecht (outside the USA) 2013

Authors and Affiliations

  • George Shu Heng Pau (1)
  • Yingqi Zhang (1)
  • Stefan Finsterle (1)
  1. Lawrence Berkeley National Laboratory, Berkeley, CA, USA
