Cyber Security Exercises and Competitions as a Platform for Cyber Security Experiments

  • Teodor Sommestad
  • Jonas Hallberg
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7617)

Abstract

This paper discusses the use of cyber security exercises and competitions to produce data valuable for security research. Such exercises and competitions are primarily arranged to train participants or to offer a contest of competence to those with a profound interest in security. This paper describes how they can also serve as a basis for experimentation in the security field. The conjecture is that (1) they make it possible to control a number of variables of relevance to security and (2) the results can be used to study several topics in the security field in a meaningful way. Among other things, they can be used to validate security metrics and to assess the impact of different protective measures on the security of a system.
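
As a concrete illustration of point (2), consider comparing a candidate security metric between two exercise conditions. The Python sketch below is a minimal, hypothetical example: the choice of metric (time-to-compromise), the data, and the helper welch_t are invented for illustration and are not taken from the paper.

    # Hypothetical sketch: the data and the choice of metric are invented
    # for illustration; they do not come from the paper.
    from math import sqrt
    from statistics import mean, stdev

    # Time-to-compromise (hours) per defended system, grouped by whether
    # a protective measure (e.g., an intrusion detection system) was used.
    with_measure = [14.0, 9.5, 17.2, 11.8, 13.4]
    without_measure = [6.1, 8.0, 5.4, 9.3, 7.7]

    def welch_t(a, b):
        # Welch's t statistic for two independent samples with
        # possibly unequal variances.
        se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
        return (mean(a) - mean(b)) / se

    print(f"mean TTC with measure:    {mean(with_measure):.1f} h")
    print(f"mean TTC without measure: {mean(without_measure):.1f} h")
    print(f"Welch's t: {welch_t(with_measure, without_measure):.2f}")

Because the exercise organiser decides which teams defend which system configuration, a comparison of this kind approximates a controlled experiment rather than an uncontrolled field observation.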

Keywords

research method · data collection · security competitions · security exercises

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Teodor Sommestad, Swedish Defence Research Agency, Linköping, Sweden
  • Jonas Hallberg, Swedish Defence Research Agency, Linköping, Sweden
