Gaussian Processes for history-matching: application to an unconventional gas reservoir

  • Original Paper
  • Published in Computational Geosciences

Abstract

Reservoir history-matching is a costly task. Many available history-matching algorithms either fail to perform the task or require a large number of simulation runs. To overcome these difficulties, we apply Gaussian Process (GP) modeling to approximate the costly objective function and to expedite the search for the global optimum. A GP model is a proxy that models the input-output relationship by assuming a multi-Gaussian distribution on the output values. An infill criterion is used in conjunction with the GP model to sequentially add samples with potentially lower objective values. The IC fault model is used to compare the efficiency of the GP-based optimization method with that of other typical optimization methods in minimizing the objective function. We then demonstrate the applicability of the GP modeling approach to reservoir history-matching, exemplified by numerical analysis of production data from a horizontal multi-stage fractured tight gas condensate well. For the case studied here, the method converges to the lowest objective values in fewer than 100 simulations for this 20-dimensional problem, which is almost ten times faster than the Differential Evolution (DE) algorithm, itself known to be a powerful optimization technique. Sensitivity analyses are conducted to explain the performance of the GP-based optimization technique with various correlation functions.
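
The workflow described above can be summarized as a standard GP-based (Bayesian) optimization loop. The following minimal Python sketch illustrates the idea using scikit-learn and an expected-improvement infill criterion; the toy misfit function, the uniform candidate search, and all parameter choices are illustrative assumptions, not the authors' implementation, which couples a GP proxy to a reservoir simulator.

# Minimal sketch of a GP-based optimization loop with an expected-improvement (EI)
# infill criterion (illustrative only; the misfit function and search settings below
# are placeholders, not the authors' reservoir-simulation setup).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def misfit(x):
    # Stand-in for the costly objective, e.g. a simulation-based production misfit.
    return float(np.sum((x - 0.3) ** 2))

d = 20                                    # problem dimension, as in the field example
rng = np.random.default_rng(0)
X = rng.uniform(size=(5 * d, d))          # initial space-filling design in [0, 1]^d
y = np.array([misfit(x) for x in X])

def expected_improvement(gp, Xc, y_best):
    # Standard EI for minimization, built from the GP predictive mean and std.
    mu, s = gp.predict(Xc, return_std=True)
    s = np.maximum(s, 1e-12)
    z = (y_best - mu) / s
    return (y_best - mu) * norm.cdf(z) + s * norm.pdf(z)

for _ in range(50):                       # sequential infill iterations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5, length_scale=np.ones(d)),
                                  normalize_y=True, n_restarts_optimizer=5)
    gp.fit(X, y)
    Xc = rng.uniform(size=(2000, d))      # cheap random candidates (a dedicated optimizer could be used)
    x_new = Xc[np.argmax(expected_improvement(gp, Xc, y.min()))]
    X = np.vstack([X, x_new])             # one expensive simulation per iteration
    y = np.append(y, misfit(x_new))

print("best misfit:", y.min())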

Abbreviations

1D:

One-dimensional

A:

Non-dominated region

Bar sign “¯”:

Average value

CCE:

Constant Composition Experiment

c(x,x’):

Kernel or covariance function between two locations x and x'

CGR:

Condensate-Gas Ratio

d :

The dimension of the problem

det[C]:

Determinant of covariance matrix C

D n :

Training data set with n samples

D n+1 :

Augmented data set with n + 1 samples

DE:

Differential Evolution

DFIT:

Diagnostic Fracture Injection Test

EI:

Expected Improvement (see the formula following this list)

EnKF:

Ensemble Kalman Filter

ES:

Ensemble Smoother

ES-MDA:

Ensemble Smoother for Multiple Data Assimilation

F:

Degree Fahrenheit

f :

The output of the truth function

f :

A vector containing the output of the truth function in several locations

F :

The output of the truth function as a random variable

F :

The output of the truth function in several locations as a random vector

GP:

Gaussian Process

h :

Reservoir thickness, ft

k × w :

Fracture conductivity, md ft

k f :

Current fracture permeability, md

k i :

Original (initial) fracture permeability, md

KB:

Kelly bushing

l :

Lateral length, ft

Ln(L):

Negative concentrated log-likelihood

M :

Misfit function

n :

The number of available samples (simulations)

p :

Current pressure, psi

p i :

Initial pressure, psi

pr:

Probability distribution

PR-EOS:

Peng-Robinson Equation of State

q :

Production flow rate, bbl/day (liquid) or MMscf (gas)

Q wr :

Remaining water in the reservoir after injection, ft³

r :

The correlation vector between the sample x* and the data D

R :

Covariance matrix, equal to σ²C

S²(x*):

The variance of the predicted value y* at sample x*, given by the GP

SRV:

Stimulated Reservoir Volume

S winit :

Initial water saturation in the model

S wSRV :

Initial water saturation in the SRV

TVD:

True Vertical Depth

x i :

A sample i

x f :

Fracture half length, ft

Y :

The posterior distribution of the modeled objective

ŷ :

The predictive mean of the GP model

w :

Fracture width, ft

W SRV :

The width of a 1D SRV

b :

The current best member

n +1:

The training data set augmented by adding a new sample

o,g,w :

oil, gas, water

obs :

Observed data

sim :

Simulation data

γ :

Fracture reduction factor

δ :

Molar composition of components in oil or gas

𝜃 :

GP hyperparameters (length scales) of dimension d

λ :

An anisotropic distance measure

\(\hat {\mu }\) :

The estimated mean of the GP model knowing the data

μ :

The prior mean of the GP model

ν :

A constant used in defining the Matérn correlation function

\(\hat {\sigma }^{\mathrm {2}}\) :

The estimated variance of the GP model knowing the data

φ :

Porosity

ψ :

The normal cumulative distribution function

ψ s :

The standard normal cumulative distribution function

ϕ :

The normal probability density function

ϕ s :

The standard normal probability density function
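
For context, the expected improvement (EI) infill criterion mentioned in the abstract is commonly written, for minimization, in terms of the symbols listed above (the GP predictive mean ŷ, the predictive variance S²(x*), the current best member b, and the standard normal CDF ψ_s and PDF ϕ_s). The form below is the standard one from the Bayesian-optimization literature and is stated here as a notational aid rather than quoted from the paper:

\[
\mathrm{EI}(x^{*}) = \bigl(f(x_{b}) - \hat{y}(x^{*})\bigr)\,
\psi_{s}\!\left(\frac{f(x_{b}) - \hat{y}(x^{*})}{s(x^{*})}\right)
+ s(x^{*})\,
\phi_{s}\!\left(\frac{f(x_{b}) - \hat{y}(x^{*})}{s(x^{*})}\right),
\qquad s(x^{*}) = \sqrt{S^{2}(x^{*})}.
\]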


Author information

Corresponding author

Correspondence to Hamidreza Hamdi.

About this article

Cite this article

Hamdi, H., Couckuyt, I., Sousa, M.C. et al. Gaussian Processes for history-matching: application to an unconventional gas reservoir. Comput Geosci 21, 267–287 (2017). https://doi.org/10.1007/s10596-016-9611-2

