Research in Engineering Design, Volume 18, Issue 3, pp 111–128

Validating behavioral models for reuse

Original Paper

Abstract

When using a model to predict the behavior of a physical system of interest, engineers must be confident that, under the conditions of interest, the model is an adequate representation of the system. The process of building this confidence is called model validation. It requires that engineers have knowledge about the system and conditions of interest, the properties of the model, and their own tolerance for uncertainty in the predictions. To reduce time and costs, engineers often reuse preexisting models that other engineers have developed. However, if the user lacks critical parts of this knowledge, model validation can be as time-consuming and costly as developing a similar model from scratch. In this article, we describe a general process for performing model validation for reused behavioral models that overcomes this problem by relying on the formalization and exchange of knowledge. We identify the critical elements of this knowledge, discuss how to represent it, and demonstrate the overall process on a simple engineering example.
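The abstract argues that a reused model must travel with formalized knowledge about the conditions under which it was validated, so that a reuser can judge its applicability without repeating the original validation effort. As a minimal sketch of that idea (not the paper's actual representation; the class, parameter names, and numbers below are illustrative assumptions), one could publish a model alongside its validated parameter ranges and an established accuracy bound, and have the prospective reuser check whether the intended-use conditions fall inside that validated context:

```python
# Hypothetical sketch: pairing a reusable model with formalized
# validation knowledge (validated parameter ranges + accuracy bound),
# so a reuser can check applicability before trusting predictions.
from dataclasses import dataclass, field


@dataclass
class ValidatedContext:
    """Knowledge captured during validation of a behavioral model."""
    ranges: dict = field(default_factory=dict)  # parameter -> (low, high)
    max_error: float = 0.0  # accuracy bound established during validation

    def covers(self, conditions: dict) -> bool:
        """True iff every intended-use condition lies in a validated range."""
        for name, value in conditions.items():
            if name not in self.ranges:
                return False  # condition never examined during validation
            low, high = self.ranges[name]
            if not (low <= value <= high):
                return False  # condition outside the validated domain
        return True


# Illustrative example: a beam-deflection model validated for loads of
# 0-500 N and spans of 0.1-2.0 m, with a documented 5% error bound.
ctx = ValidatedContext(
    ranges={"load_N": (0.0, 500.0), "span_m": (0.1, 2.0)},
    max_error=0.05,
)

print(ctx.covers({"load_N": 300.0, "span_m": 1.5}))  # within validated context
print(ctx.covers({"load_N": 800.0, "span_m": 1.5}))  # load exceeds validated range
```

In practice the exchanged knowledge would be richer than interval bounds (the paper discusses representing model context and the user's uncertainty tolerance), but the check above captures the core mechanism: applicability is decided from recorded validation knowledge rather than re-derived from scratch.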

Keywords

Model validation · Model reuse · Model characterization · Model context

Acknowledgments

The authors thank Jason Aughenbaugh, Jay Ling, Steven Rekuc, and Morgan Bruns for their contributions to this work. This work is supported by the G.W.W. School of Mechanical Engineering at the Georgia Institute of Technology and NASA Ames Research Center under cooperative agreement NNA04CK40A.


Copyright information

© Springer-Verlag London Limited 2007

Authors and Affiliations

  • Richard J. Malak Jr.¹
  • Christiaan J. J. Paredis¹
  1. Systems Realization Laboratory, G.W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, USA
