Validation, Verification, and Uncertainty Quantification for Models with Intelligent Adversaries

  • Jing Zhang
  • Jun Zhuang
Reference work entry

Abstract

Model verification and validation (V&V) are essential before a model can be implemented in practice. Integrating V&V into the process of model development helps reduce the risk of errors, improves the accuracy of the model, and strengthens decision-makers' confidence in model results. Alongside V&V, uncertainty quantification (UQ) techniques are used to assess and build confidence in computational models. Modeling intelligent adversaries is different from, and more difficult than, modeling non-intelligent agents, yet it is critical to infrastructure protection and national security. Model V&V and UQ for models with intelligent adversaries therefore pose a significant challenge. This chapter first reviews the concepts of model V&V and UQ in the literature and then discusses model V&V and UQ for intelligent adversaries. Some V&V techniques for modeling intelligent adversaries are provided, which could benefit model developers and decision-makers facing intelligent adversaries.
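As an illustration of the kind of UQ the abstract refers to (this example is not drawn from the chapter), uncertainty in an adversary model can be propagated by Monte Carlo sampling: uncertain attacker valuations are drawn repeatedly and pushed through a best-response rule, yielding a distribution over attacked targets rather than a single prediction. The payoff form, the `attacker_best_target` rule, and the defense levels below are all hypothetical choices made for the sketch.

```python
import random

def attacker_best_target(valuations, defenses):
    # Illustrative best-response rule: the attacker picks the target
    # with the highest valuation net of the defense level.
    payoffs = [v - d for v, d in zip(valuations, defenses)]
    return max(range(len(payoffs)), key=payoffs.__getitem__)

def monte_carlo_attack_probabilities(defenses, n_samples=10_000, seed=0):
    # Propagate epistemic uncertainty in the attacker's valuations
    # (here assumed uniform on [0, 10]) through the best-response model,
    # returning the estimated attack probability for each target.
    rng = random.Random(seed)
    counts = [0] * len(defenses)
    for _ in range(n_samples):
        valuations = [rng.uniform(0.0, 10.0) for _ in defenses]
        counts[attacker_best_target(valuations, defenses)] += 1
    return [c / n_samples for c in counts]

# Three targets with increasing (hypothetical) defense investment.
probs = monte_carlo_attack_probabilities([2.0, 5.0, 8.0])
```

Under this sketch the least-defended target attracts the largest share of simulated attacks, which is the sort of qualitative behavior a face-validity check on an adversary model would look for.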

Keywords

Decision making · Intelligent adversaries · Model validation and verification · Validation techniques

Notes

Acknowledgements

This research was partially supported by the United States Department of Homeland Security (DHS) through the National Center for Risk and Economic Analysis of Terrorism Events (CREATE) under award number 2010-ST-061-RE0001. This research was also partially supported by the United States National Science Foundation under award numbers 1200899 and 1334930. However, any opinions, findings, and conclusions or recommendations in this document are those of the authors and do not necessarily reflect the views of the DHS, CREATE, or NSF. The authors assume responsibility for any errors.


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  1. Department of Industrial and Systems Engineering, New York State University at Buffalo, Buffalo, USA
