
Multi-response Approach to Improving Identifiability in Model Calibration

  • Zhen Jiang
  • Paul D. Arendt
  • Daniel W. Apley
  • Wei Chen
Reference work entry

Abstract

In physics-based engineering modeling, the two primary sources of model uncertainty that account for differences between computer models and physical experiments are parameter uncertainty and model discrepancy. A central challenge in model updating is distinguishing the effects of the calibration parameters from those of model discrepancy. In this chapter, this identifiability problem is illustrated with several examples that explain the mechanisms behind it and shed light on when a system may or may not be identifiable. For situations in which identifiability cannot be achieved using a single response alone, an approach is developed that improves identifiability by using multiple responses sharing a mutual dependence on the calibration parameters. Furthermore, to decide, after the computer simulations have been run but before the physical experiments are conducted, which set of responses to measure experimentally in order to best enhance identifiability, a preposterior analysis approach is presented that predicts the degree of identifiability resulting from each candidate set of responses. To address the computational burden of the preposterior analysis, a surrogate preposterior analysis based on the Fisher information of the calibration parameters is also presented.
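For context, the model updating formulation underlying this chapter (following Kennedy and O'Hagan) relates an experimental observation y^e at input settings x to the computer model y^m evaluated at the true but unknown calibration parameters θ*, a discrepancy function δ, and experimental noise:

    y^e(\mathbf{x}) = y^m(\mathbf{x}, \boldsymbol{\theta}^*) + \delta(\mathbf{x}) + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma_\varepsilon^2).

Non-identifiability arises because many combinations of θ and δ(·) can fit the same experimental data equally well; the multi-response approach applies this formulation to several responses that share the same θ* but have response-specific discrepancy functions, and the shared dependence helps separate the two effects.

The following minimal sketch illustrates the kind of quantity the surrogate preposterior analysis is built on: the observed Fisher information of a calibration parameter, i.e., the curvature of the negative log-likelihood, with larger values indicating a sharper likelihood and hence better identifiability of θ from a given set of measured responses. It uses a hypothetical model y = θ·sin(x) with made-up noise settings and is not the chapter's algorithm.

```python
import numpy as np

# Toy sketch (hypothetical model and settings, not the chapter's method):
# observed Fisher information of a scalar calibration parameter theta,
# approximated by a finite-difference second derivative of -log L.

def neg_log_lik(theta, x, y, sigma=0.1):
    # Negative log-likelihood of y ~ N(theta * sin(x), sigma^2), constants dropped.
    resid = y - theta * np.sin(x)
    return 0.5 * np.sum(resid ** 2) / sigma ** 2

def observed_fisher(theta, x, y, h=1e-4):
    # Central difference approximation to d^2(-log L)/d theta^2 at theta.
    f0 = neg_log_lik(theta, x, y)
    return (neg_log_lik(theta + h, x, y) - 2.0 * f0
            + neg_log_lik(theta - h, x, y)) / h ** 2

rng = np.random.default_rng(0)
x = np.linspace(0.0, np.pi, 20)
y = 1.5 * np.sin(x) + rng.normal(scale=0.1, size=x.size)  # synthetic "experimental" data

# For this model, the value is approximately sum(sin(x)^2) / sigma^2.
print(observed_fisher(1.5, x, y))
```

In the surrogate preposterior analysis described in the abstract, an information measure of this type is evaluated for candidate sets of responses (before any physical data are collected) to rank which measurements are expected to make the calibration parameters most identifiable.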

Keywords

Parameter uncertainty · Model discrepancy · Experimental uncertainty · Calibration · Bias correction · (Non)identifiability · Identifiability · Model uncertainty quantification · Calibration parameters · Discrepancy function · Gaussian process · Modular Bayesian approach · Hyperparameters · Simply supported beam · Non-informative prior · Multi-response Gaussian process · Multi-response modular Bayesian approach · Spatial correlation · Non-spatial covariance · Preposterior covariance · Preposterior analysis · Fixed-θ preposterior analysis · Surrogate preposterior analysis · Observed Fisher information


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Zhen Jiang (1)
  • Paul D. Arendt (2)
  • Daniel W. Apley (3)
  • Wei Chen (1)

  1. Department of Mechanical Engineering, Northwestern University, Evanston, USA
  2. CNA Financial Corporation, Chicago, USA
  3. Department of Industrial Engineering and Management Sciences, Northwestern University, Evanston, USA
