Towards the Diagnosis and Simulation of Discrepancies in Dynamical Models

P. L. Green
Conference paper
Part of the Conference Proceedings of the Society for Experimental Mechanics Series (CPSEMS) book series


Models are frequently used to make predictions in regions where experimental testing is difficult. This often involves extrapolating to regions far from where the model was validated. In this paper an example is shown where, despite using a Bayesian analysis to quantify parameter estimation uncertainties, such an extrapolation performs poorly. It is then demonstrated that, in the presence of measurement noise, treating a system’s parameters as being time-variant (even if this is not believed to be true) can reveal fundamental flaws in a model. Finally, existing methods which can be used to quantify model error—the inevitable discrepancies that arise because of approximations made during model development—are extended towards dynamical systems.
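The diagnostic described above — letting a parameter that is believed to be constant evolve as a random walk inside a particle filter, and watching whether its estimate drifts systematically — can be sketched in a few lines. The example below is illustrative only, not the paper's implementation: the "true" cubic map, the assumed linear model, and all noise levels are assumptions chosen for the sketch. A bootstrap particle filter tracks a time-varying coefficient a_t of the (wrong) linear model; because the data were generated by a nonlinear map, the estimate of a_t wanders with the response amplitude instead of settling near a constant, which is the flag for model error.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic "truth": a nonlinear map the modeller does not know about ---
T = 200
u = 0.5 * np.sin(0.1 * np.arange(T))           # known input
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = 0.9 * x[t] - 0.4 * x[t] ** 3 + u[t]
y = x + 0.05 * rng.standard_normal(T)          # noisy measurements

# --- Bootstrap particle filter for the (wrong) linear model x' = a x + u ---
# Each particle carries a state x and a parameter a; 'a' is given slow
# random-walk dynamics even though the modeller believes it is constant.
N = 2000
xp = 0.05 * rng.standard_normal(N)             # state particles
ap = 0.9 + 0.1 * rng.standard_normal(N)        # parameter particles
sig_a, sig_x, sig_y = 0.02, 0.05, 0.05

a_est = np.zeros(T)
for t in range(T):
    # log-weights from the measurement likelihood (max subtracted for stability)
    logw = -0.5 * ((y[t] - xp) / sig_y) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    a_est[t] = np.sum(w * ap)                  # posterior mean of a_t
    # multinomial resampling
    idx = rng.choice(N, size=N, p=w)
    xp, ap = xp[idx], ap[idx]
    # propagate through the assumed (linear) model
    ap = ap + sig_a * rng.standard_normal(N)
    xp = ap * xp + u[t] + sig_x * rng.standard_normal(N)

# If the linear model were adequate, a_est would hover near one value;
# a drift correlated with response amplitude reveals the missing cubic term.
print("spread of a_t estimates:", a_est.max() - a_est.min())
```

In this sketch the effective linear gain of the true map is 0.9 - 0.4 x_t^2, so the tracked a_t sweeps over a wide range as the input amplitude changes — exactly the kind of structured, noise-exceeding variation that would not appear if the model form were correct.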


Keywords: Verification and validation · Particle filter · Gaussian process · Nonlinear dynamics · System identification



Copyright information

© The Society for Experimental Mechanics, Inc. 2016

Authors and Affiliations

School of Engineering, Centre for Engineering Sustainability, Institute for Risk and Uncertainty, University of Liverpool, Liverpool, UK
