Quantification of Aleatoric and Epistemic Uncertainty in Computational Models of Complex Systems
For complex engineering systems, testing-based assessment is increasingly being replaced by simulation with detailed computational models, owing to a lack of experimental data and/or of the resources needed to conduct experiments at the system level. Components of the system are usually cheaper to build and test than the system itself. The availability of component-level data, coupled with the scarcity of system-level data and the complexity of the system being modeled, motivates building models in a building-block or hierarchical manner: data at the component level guide the development of each component model, and these models are then coupled to form the system model. Quantifying the uncertainty in a system response is required to establish confidence that the model represents the actual system behavior. To be accurate, this quantification must include both aleatoric uncertainty (due to natural variability) and epistemic uncertainty (due to lack of, or incomplete, knowledge). This paper proposes a framework based on Bayes networks that uses the available data at multiple levels of complexity (i.e., component, subsystem, etc.) and allows quantification and propagation of both types of uncertainty in a system-model prediction. A method to incorporate epistemic uncertainty given as intervals on a model parameter is presented, and a numerical example demonstrating the approach is shown.
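The propagation of mixed uncertainty described in the abstract can be illustrated with a two-loop (second-order) Monte Carlo scheme, a common way to handle an interval-valued epistemic parameter alongside an aleatoric variable. The sketch below is an assumption for illustration only: the component model `component_model`, the interval `[theta_lo, theta_hi]`, and the normal aleatoric input are hypothetical stand-ins, not the paper's Bayes-network formulation.

```python
# Hedged sketch: two-loop Monte Carlo propagation of mixed uncertainty.
# The interval [theta_lo, theta_hi] on theta represents epistemic uncertainty;
# x ~ Normal(mu, sigma) represents aleatoric (natural) variability.
import random

def component_model(x, theta):
    """Hypothetical component response; placeholder for a real simulation."""
    return theta * x + 0.1 * x ** 2

def propagate(theta_lo=0.8, theta_hi=1.2, mu=1.0, sigma=0.2,
              n_epistemic=11, n_aleatoric=2000, seed=0):
    """Outer loop steps over the epistemic interval; the inner Monte Carlo
    loop samples the aleatoric variable. Returns an interval (bounds) on
    the mean response rather than a single probability distribution."""
    rng = random.Random(seed)
    means = []
    for i in range(n_epistemic):
        theta = theta_lo + (theta_hi - theta_lo) * i / (n_epistemic - 1)
        samples = [component_model(rng.gauss(mu, sigma), theta)
                   for _ in range(n_aleatoric)]
        means.append(sum(samples) / n_aleatoric)
    return min(means), max(means)  # epistemic bounds on the statistic

lo, hi = propagate()
print(f"mean response bounds: [{lo:.3f}, {hi:.3f}]")
```

Because the epistemic parameter is not given a distribution, the output is a bound on the response statistic rather than a single value, which is the qualitative behavior an interval-based epistemic treatment produces.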
Keywords: Foam, Sine, Cyan