Large scale variable fidelity surrogate modeling

Abstract

Engineers widely use the Gaussian process regression framework to construct surrogate models that replace computationally expensive physical models during design space exploration. Thanks to the properties of Gaussian processes, we can use samples generated both by a high fidelity function (an expensive and accurate representation of a physical phenomenon) and by a low fidelity function (a cheap and coarse approximation of the same phenomenon) when constructing a surrogate model. However, once sample sizes exceed a few thousand points, the computational cost of Gaussian process regression becomes prohibitive, both for learning and for prediction. We propose two approaches to circumvent this computational burden: one is based on the Nyström approximation of sample covariance matrices, and the other on intelligent use of a blackbox that can evaluate the low fidelity function on the fly at any point of the design space. We examine the performance of the proposed approaches on a number of artificial and real problems, including engineering optimization of a rotating disk shape.
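The Nyström approach mentioned above can be illustrated with a minimal sketch: a dense covariance matrix over n points is replaced by a low-rank factorization built from m landmark points, reducing the cubic cost of exact Gaussian process inference. The helper names (`rbf_kernel`, `nystrom_approx`) are hypothetical and do not correspond to the authors' implementation; this only shows the generic technique.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    """Squared-exponential covariance between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def nystrom_approx(X, m, lengthscale=1.0, seed=0):
    """Rank-m Nystrom approximation K ~ K_nm K_mm^+ K_mn,
    built from m landmark points chosen uniformly at random."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    K_nm = rbf_kernel(X, X[idx], lengthscale)       # n x m cross-covariance
    K_mm = rbf_kernel(X[idx], X[idx], lengthscale)  # m x m landmark covariance
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

# Usage: relative error shrinks as the number of landmarks m grows.
X = np.random.default_rng(1).normal(size=(200, 2))
K = rbf_kernel(X, X)                 # exact 200 x 200 covariance matrix
K_hat = nystrom_approx(X, m=50)      # rank-50 approximation
err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

Because the fast spectral decay of the squared-exponential kernel concentrates most of the covariance structure in a few eigenvectors, a modest number of landmarks already yields a close approximation here.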

Acknowledgements

We thank Dmitry Khominich from DATADVANCE llc for making the solvers for the rotating disk problem available.

Author information

Corresponding author

Correspondence to A. Zaytsev.

Additional information

The research was supported by the Russian Science Foundation grant (project 14-50-00150).

About this article

Cite this article

Zaytsev, A., Burnaev, E. Large scale variable fidelity surrogate modeling. Ann Math Artif Intell 81, 167–186 (2017). https://doi.org/10.1007/s10472-017-9545-y

Keywords

  • Variable fidelity data
  • Gaussian process regression
  • Nyström approximation
  • Cokriging

Mathematics Subject Classification (2010)

  • 60G15
  • 65F30
  • 62G08