Two Architectural Threat Analysis Techniques Compared

  • Katja Tuma
  • Riccardo Scandariato
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11048)

Abstract

In an initial attempt to systematize the research field of architectural threat analysis, this paper presents a comparative study of two threat analysis techniques. In particular, the controlled experiment presented here compares two variants of Microsoft’s STRIDE. The two variants differ in the way the analysis is performed. In one case, each component of the software system is considered in isolation and scrutinized for potential security threats. In the other case, the analysis has a wider scope and considers the security threats that might occur in a pair of interacting software components. The study compares the techniques with respect to their effectiveness in finding security threats (benefits) as well as the time it takes to perform the analysis (cost). We also look into other human aspects that matter for industrial adoption, such as the perceived difficulty of learning and applying the techniques and the overall preference of our experimental participants.
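
For intuition, the Python sketch below contrasts the two analysis scopes on a toy data-flow diagram. It is not the authors' material: the element names, the per-element category table (taken from the commonly published STRIDE-per-element mapping in Shostack's book), and the way interaction-level categories are derived are all illustrative assumptions.

```python
# Illustrative sketch of the two STRIDE variants compared in the paper.
# Variant 1 (per-element) scrutinizes each DFD element in isolation;
# variant 2 (per-interaction) widens the scope to a pair of interacting
# elements connected by a data flow.

from dataclasses import dataclass

# Assumed mapping of applicable STRIDE categories per DFD element type
# (the widely cited per-element table; not taken from this paper).
PER_ELEMENT = {
    "external_entity": ["Spoofing", "Repudiation"],
    "process": ["Spoofing", "Tampering", "Repudiation",
                "Information disclosure", "Denial of service",
                "Elevation of privilege"],
    "data_store": ["Tampering", "Repudiation",
                   "Information disclosure", "Denial of service"],
    "data_flow": ["Tampering", "Information disclosure",
                  "Denial of service"],
}

@dataclass(frozen=True)
class Element:
    name: str
    kind: str  # one of the PER_ELEMENT keys

@dataclass(frozen=True)
class Flow:
    source: Element
    target: Element
    data: str

def stride_per_element(elements):
    """Variant 1: each element is analyzed in isolation."""
    for el in elements:
        for category in PER_ELEMENT[el.kind]:
            yield (el.name, category)

def stride_per_interaction(flows):
    """Variant 2: threats are elicited per (source, flow, target) tuple.
    Simplification: the published per-interaction variant uses a dedicated
    table keyed on the interaction; here we approximate that scope by
    uniting the categories of both endpoints and the flow itself."""
    for f in flows:
        categories = (set(PER_ELEMENT[f.source.kind])
                      | set(PER_ELEMENT[f.target.kind])
                      | set(PER_ELEMENT["data_flow"]))
        for category in sorted(categories):
            yield (f"{f.source.name} -> {f.target.name} ({f.data})", category)

if __name__ == "__main__":
    user = Element("User", "external_entity")
    web = Element("Web app", "process")
    db = Element("Accounts DB", "data_store")
    flows = [Flow(user, web, "credentials"), Flow(web, db, "query")]

    for scope, cat in stride_per_element([user, web, db]):
        print(f"[per-element]     {scope}: {cat}")
    for scope, cat in stride_per_interaction(flows):
        print(f"[per-interaction] {scope}: {cat}")
```

The two generators make the cost/benefit trade-off studied in the experiment concrete: the per-interaction variant enumerates more candidate threats per scope (potentially better coverage, higher analysis time), while the per-element variant keeps each analysis step small and local.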

Keywords

Empirical study · Secure software · Threat analysis · STRIDE


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Chalmers University of Technology, Gothenburg, Sweden
  2. University of Gothenburg, Gothenburg, Sweden