A machine-learning-accelerated distributed LBFGS method for field development optimization: algorithm, validation, and applications

  • Original Paper
Computational Geosciences

Abstract

We have developed a support vector regression (SVR) accelerated variant of the distributed derivative-free optimization (DFO) method that uses the limited-memory BFGS (LBFGS) Hessian updating formulation for subsurface field-development optimization problems. The SVR-enhanced distributed LBFGS (D-LBFGS) optimizer is designed to effectively locate multiple local optima of highly nonlinear optimization problems subject to numerical noise. It operates on both single- and multiple-objective field-development optimization problems. The basic D-LBFGS DFO optimizer runs multiple optimization threads in parallel and uses linear interpolation to approximate the sensitivity matrix of simulated responses with respect to the optimized model parameters. However, this approach is less accurate and slows down convergence. In this paper, we implement an effective variant of the SVR method, namely ε-SVR, and integrate it into the D-LBFGS engine in synchronous mode within the framework of a versatile optimization library inside a next-generation reservoir simulation platform. Because ε-SVR has a closed-form predictive formulation, we analytically calculate the approximated objective function and its gradients with respect to the input model variables subject to optimization. We investigate two different methods for proposing a new search point for each optimization thread in each iteration through seamless integration of ε-SVR with the D-LBFGS optimizer. The first method estimates the sensitivity matrix and the gradients directly using the analytical ε-SVR surrogate and then solves an LBFGS trust-region subproblem (TRS). The second method applies a trust-region-search LBFGS method to optimize the approximated objective function given by the analytical ε-SVR surrogate within a box-shaped trust region. We first show that ε-SVR provides accurate estimates of gradient vectors on a set of nonlinear analytical test problems.
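A key point above is that the ε-SVR predictor is available in closed form, so the surrogate's gradient can be written down analytically rather than estimated by interpolation. The following is a minimal sketch of that idea, assuming a Gaussian (RBF) kernel and already-fitted dual coefficients; the function names and kernel choice are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def rbf_kernel(X, x, gamma):
    # Gaussian kernel K(x_i, x) = exp(-gamma * ||x_i - x||^2), one value per row of X
    d = X - x
    return np.exp(-gamma * np.sum(d * d, axis=1))

def svr_predict(X, coef, b, x, gamma):
    # Closed-form predictor f(x) = sum_i coef_i * K(x_i, x) + b,
    # where coef_i are the dual coefficients of the support vectors in X
    return coef @ rbf_kernel(X, x, gamma) + b

def svr_gradient(X, coef, x, gamma):
    # Differentiating the predictor analytically:
    # grad f(x) = sum_i coef_i * 2 * gamma * (x_i - x) * K(x_i, x)
    K = rbf_kernel(X, x, gamma)
    return 2.0 * gamma * (coef * K) @ (X - x)
```

Checking such an analytical gradient against a finite-difference estimate mirrors the gradient-accuracy validation the paper reports on analytical test problems.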
We then report the results of numerical experiments conducted using the newly proposed SVR-enhanced D-LBFGS algorithms on both synthetic and realistic field-development optimization problems. We demonstrate that these algorithms operate effectively on realistic nonlinear optimization problems subject to numerical noise. We show that both SVR-enhanced D-LBFGS variants converge faster and thereby provide a significant acceleration over the basic implementation of D-LBFGS with linear interpolation.
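The second proposal method described above, optimizing the surrogate inside a box-shaped trust region with an LBFGS-type solver, can be sketched with an off-the-shelf bound-constrained LBFGS. Here SciPy's L-BFGS-B stands in for the paper's trust-region-search LBFGS, and the objective is a hypothetical smooth function standing in for the analytical ε-SVR surrogate:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical smooth objective standing in for the analytical eps-SVR surrogate;
# its gradient is available in closed form, as the surrogate's would be.
def surrogate(x):
    return np.sum((x - 1.0) ** 2) + 0.1 * np.sum(np.sin(5.0 * x))

def surrogate_grad(x):
    return 2.0 * (x - 1.0) + 0.5 * np.cos(5.0 * x)

def propose_step(x_k, radius):
    # Box-shaped trust region centered at the current iterate x_k:
    # each variable may move at most `radius` from its current value.
    bounds = [(xi - radius, xi + radius) for xi in x_k]
    res = minimize(surrogate, x_k, jac=surrogate_grad,
                   method="L-BFGS-B", bounds=bounds)
    return res.x

x_new = propose_step(np.zeros(4), radius=0.5)
```

Each optimization thread would then evaluate the true (simulated) objective at the proposed point and accept or reject the step according to its trust-region logic.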


Data availability

This manuscript has no associated data.


Acknowledgments

The authors would like to thank Shell International Exploration and Production Inc. for permission to publish this paper.

Author information

Corresponding author

Correspondence to Faruk Alpak.

Ethics declarations

Competing interests

The authors declare that they have no known competing funding/financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Alpak, F., Gao, G., Florez, H. et al. A machine-learning-accelerated distributed LBFGS method for field development optimization: algorithm, validation, and applications. Comput Geosci 27, 425–450 (2023). https://doi.org/10.1007/s10596-023-10197-3

