Improving and Measuring Learning Effectiveness at Cyber Defense Exercises

  • Kaie Maennel
  • Rain Ottis
  • Olaf Maennel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10674)

Abstract

Cyber security exercises are widely believed to be the most effective form of training for audiences ranging from top professional teams to individual students. However, evidence of learning outcomes is often anecdotal and not validated. This paper focuses on measuring the learning outcomes of technical cyber defense exercises (CDXs) with Red and Blue teaming elements. We studied learning at Locked Shields, the largest unclassified defensive live-fire CDX in the world. We propose a novel and simple methodology, called the “5-timestamp methodology”, that accommodates both effective feedback (including benchmarking) and learning measurement. The methodology centers on collecting timestamps at specific points during a cyber incident and analyzing the resulting time intervals to assess team performance; we argue that changes in performance over time can serve as evidence of learning. The timestamps can be collected either non-intrusively from raw network traces (such as pcaps and logs) or with traditional methods, such as interviews, observations and surveys. Our experience showed that traditional methods, such as self-reporting, fail in high-speed and complex exercises. The suggested method enhances the feedback loop, helps identify learning design flaws, and provides evidence of the learning value of CDXs.
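
To make the interval analysis concrete, below is a minimal Python sketch (ours, for illustration; not from the paper). It assumes five illustrative timestamps per incident (attack start, detection, report, mitigation start, recovery; the abstract does not name the five points), derives the intervals a team could be benchmarked on, and shows one non-intrusive way to pull an event time from raw logs. The event names, log pattern, and sample times are assumptions.

    import re
    from dataclasses import dataclass
    from datetime import datetime

    TS_RE = r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})"

    def first_event_time(log_lines, marker):
        """Non-intrusive collection: return the first timestamp on a log
        line containing the given marker (e.g. an IDS alert signature).
        The log format is an assumption for illustration."""
        for line in log_lines:
            if marker in line:
                m = re.search(TS_RE, line)
                if m:
                    return datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
        return None

    @dataclass
    class Incident:
        # Five hypothetical timestamps per cyber incident.
        attack_start: datetime      # attack launched (e.g. from Red team logs)
        detection: datetime         # Blue team detects the activity
        report: datetime            # incident reported
        mitigation_start: datetime  # countermeasures begin
        recovery: datetime          # affected service restored

    def intervals(i):
        """Time intervals (in seconds) used to assess team performance."""
        return {
            "time_to_detect": (i.detection - i.attack_start).total_seconds(),
            "time_to_report": (i.report - i.detection).total_seconds(),
            "time_to_mitigate": (i.mitigation_start - i.report).total_seconds(),
            "time_to_recover": (i.recovery - i.mitigation_start).total_seconds(),
        }

    # Learning would be evidenced by comparing intervals across incidents
    # over the course of the exercise: shrinking intervals suggest improved
    # performance. The times below are invented for illustration.
    day1 = Incident(datetime(2017, 4, 26, 9, 0), datetime(2017, 4, 26, 9, 42),
                    datetime(2017, 4, 26, 9, 55), datetime(2017, 4, 26, 10, 10),
                    datetime(2017, 4, 26, 11, 5))
    day2 = Incident(datetime(2017, 4, 27, 9, 0), datetime(2017, 4, 27, 9, 15),
                    datetime(2017, 4, 27, 9, 21), datetime(2017, 4, 27, 9, 30),
                    datetime(2017, 4, 27, 10, 0))
    for label, inc in (("day 1", day1), ("day 2", day2)):
        print(label, intervals(inc))

A per-interval comparison of this kind supports both benchmarking across teams and, when repeated over successive incidents, tracking performance changes within a team.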

Keywords

Cyber defence exercise · Training and education · Learning outcomes · Measuring learning

Acknowledgments

This work would not have taken place without the open-minded and friendly NATO CCD COE organizing team of LS17, who allowed the authors to experiment at this large cyber exercise.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Tallinn University of Technology, Tallinn, Estonia
