State-of-the-Art: Security Competition in Talent Education

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10726)

Abstract

Security competitions have become increasingly popular events for recruitment, training, evaluation, and recreation in the field of computer security. Among these exercises, Capture the Flag (CTF) competitions have the widest audience. Participants in Jeopardy-style CTFs solve several specific challenges independently, while participants in attack-defense CTFs concentrate on maintaining vulnerable services and exploiting vulnerabilities on an end-target box. However, according to a report published by Trend Micro, a typical targeted attack proceeds in six stages: (1) Intelligence Gathering, (2) Point of Entry, (3) Command and Control Communication, (4) Lateral Movement, (5) Asset Discovery, and (6) Data Exfiltration. Lateral Movement is the key stage in which threat actors move deeper into the network. Because CTFs lack a large-scale, complex network environment, they cannot simulate a complete network penetration covering all six stages, especially Lateral Movement. Performing Lateral Movement requires the skill of Network Exploring, which current security competitions do not cover. We therefore created Explore-Exploit, an attack-defense competition that models the network penetration scenario and develops participants' Network Exploring skills. This paper aims to convey a better methodology for teaching practical attack-defense techniques to participants through an alternative to CTF.
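The abstract describes Explore-Exploit only at the design level; as a rough illustration of what the "Network Exploring" skill involves in practice, the sketch below performs a simple TCP reachability sweep of a subnet, the kind of reconnaissance a participant would do before attempting Lateral Movement. It is a minimal sketch using only the Python standard library; the subnet, port list, and function names are illustrative assumptions, not taken from the paper or the competition platform.

```python
# Minimal sketch of "Network Exploring": sweep a subnet for hosts that expose
# common services before attempting lateral movement. Subnet and port list
# are hypothetical; adapt them to the exercise network.
import socket
import ipaddress
from concurrent.futures import ThreadPoolExecutor

SUBNET = "10.0.0.0/24"          # hypothetical in-game network range
PORTS = [22, 80, 445, 3389]     # a few services worth probing
TIMEOUT = 0.5                   # seconds per connection attempt


def probe(host: str, port: int) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT):
            return True
    except OSError:
        return False


def explore(subnet: str) -> dict:
    """Map each reachable host to the list of open ports found."""
    results: dict = {}
    hosts = [str(h) for h in ipaddress.ip_network(subnet).hosts()]
    with ThreadPoolExecutor(max_workers=64) as pool:
        futures = {pool.submit(probe, h, p): (h, p)
                   for h in hosts for p in PORTS}
        for fut, (h, p) in futures.items():
            if fut.result():
                results.setdefault(h, []).append(p)
    return results


if __name__ == "__main__":
    for host, open_ports in explore(SUBNET).items():
        print(f"{host}: open ports {open_ports}")
```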

Keywords

Security competition · Talent education · Network penetration scenario · Attack-defense mode · Explore-Exploit


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Xiu Zhang (1, 2)
  • Baoxu Liu (1)
  • Xiaorui Gong (1)
  • Zhenyu Song (1)
  1. Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China
  2. School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China