
Evaluation Systems for Biological Security Risk Mitigation Training and Education

  • Giulio Maria Mancini
  • James Revill
Chapter

Abstract

Engagement, training and education have been promoted as ways to address and mitigate biological security risks, on the rationale that the attitudes and actions of more aware, competent and capable people will lower those risks. However, evaluations of the impact of engagement, training and education have often been limited to ex-post observation rather than grounded in structured educational designs that link capacity-building efforts to desired learning outcomes and to longer-term impacts on security risks. This chapter proposes a methodology for evaluating the impact of education and training, with a particular focus on security-related biorisks. The methodology seeks to evaluate education and training as a risk mitigation measure, linking (expected) learning outcomes to (expected) risk outcomes. It draws on models of Instructional Systems Design (ISD) and of levels of learning impact, as well as on approaches to risk management, risk assessment and risk evaluation. The chapter proposes evaluation strategies and tools for measuring the impact of training and education aimed at reducing biological security risks, including examples of designed indicators, metrics and data sources. Improved competencies in four thematic areas, built through education and training and transferred into behaviour, have the potential to reduce biological security risks in specific risk scenarios, primarily by influencing the factors that shape risk likelihood. The chapter may be of value to those in the security community seeking to enhance work on metrics and measures, and it provides a theoretical framework through which projects and programmes might be better evaluated.
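The abstract's central chain (training builds competencies, competencies transferred into behaviour reduce the likelihood component of a risk scenario) can be illustrated with a minimal numerical sketch. This is not the authors' methodology: the function names, the multiplicative likelihood-times-consequence risk estimate (in the spirit of Kaplan and Garrick), and all parameter values below are illustrative assumptions.

```python
# Hypothetical sketch: training-driven competency gains, once transferred into
# workplace behaviour, reduce the likelihood component of a biological
# security risk scenario. All names and numbers here are illustrative.

def scenario_risk(likelihood: float, consequence: float) -> float:
    """Simple risk estimate as likelihood x consequence."""
    return likelihood * consequence

def adjusted_likelihood(baseline: float, competency_gain: float,
                        transfer_rate: float) -> float:
    """Scale down a baseline likelihood in proportion to the share of
    measured competency gains actually transferred into behaviour.
    competency_gain and transfer_rate are both assumed to lie in [0, 1]."""
    reduction = competency_gain * transfer_rate
    return baseline * (1.0 - reduction)

# Pre-training scenario: likelihood 0.10, consequence scored at 1000 units.
before = scenario_risk(likelihood=0.10, consequence=1000.0)

# Post-training: 40% competency gain, half of which transfers into behaviour.
after = scenario_risk(
    adjusted_likelihood(0.10, competency_gain=0.4, transfer_rate=0.5),
    consequence=1000.0,
)

print(before, after)  # 100.0 80.0
```

The point of the sketch is structural rather than numerical: the consequence term is untouched, so any risk reduction attributable to education and training flows entirely through the likelihood factor, which matches the abstract's claim that such measures act primarily on factors influencing risk likelihood.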


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Directorate-General on Migration and Home Affairs of the European Commission, Brussels, Belgium
  2. Harvard Sussex Program, SPRU, University of Sussex, Brighton, UK
