User-Centric Security Assessment of Software Configurations: A Case Study

  • Hamza Ghani
  • Jesus Luna Garcia
  • Ivaylo Petkov
  • Neeraj Suri
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8364)


Software systems are invariably vulnerable to exploits, hence the need to assess their security in order to quantify the risk their usage entails. However, existing vulnerability assessment approaches, e.g., vulnerability analyzers, have two major constraints: (a) they require the system to be already deployed in order to perform the analysis, and (b) they do not consider the criticality of the system within the organization's business processes. As a result, many users, in particular small and medium-sized enterprises, are often unable to assess the actual technical and economic impact of vulnerability exploits on their own organizations before the system's deployment. Drawing upon threat modeling techniques (i.e., attack trees), we propose a user-centric methodology to quantitatively assess a software configuration's security based on (i) the expected economic impact associated with compromising the system's security goals and (ii) a method to rank available configurations with respect to security. This paper demonstrates the feasibility and usefulness of our approach in a real-world case study based on the Amazon EC2 service: over 2000 publicly available Amazon Machine Images are analyzed and ranked with respect to a specific business profile before deployment in Amazon's cloud.
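The ranking idea sketched in the abstract can be illustrated with a minimal example. The sketch below is not the authors' implementation: it assumes each machine image comes with a list of CVSS v2 base impact letters (N/P/C for the confidentiality, integrity, and availability sub-metrics, with the standard weights 0.0/0.275/0.660) and that the user supplies hypothetical per-goal monetary losses reflecting their business profile; images are then ordered by their total expected economic impact.

```python
# Hedged sketch (not the paper's actual method): rank candidate machine
# images by an economic-impact score derived from the CVSS v2 impact
# sub-metrics of their known vulnerabilities. The per-goal losses are
# hypothetical inputs a user would derive from their business profile.

# Standard CVSS v2 impact weights: None=0.0, Partial=0.275, Complete=0.660
CVSS_IMPACT = {"N": 0.0, "P": 0.275, "C": 0.660}

def image_score(vulns, losses):
    """Expected-impact score of one image.

    vulns  -- list of (conf, integ, avail) CVSS impact letters
    losses -- dict mapping security goal -> monetary loss if compromised
    """
    total = 0.0
    for c, i, a in vulns:
        total += (CVSS_IMPACT[c] * losses["confidentiality"]
                  + CVSS_IMPACT[i] * losses["integrity"]
                  + CVSS_IMPACT[a] * losses["availability"])
    return total

def rank_images(images, losses):
    """Return image names ordered from least to most expected impact."""
    return sorted(images, key=lambda name: image_score(images[name], losses))

# Hypothetical business profile and per-image vulnerability data.
losses = {"confidentiality": 50_000, "integrity": 30_000, "availability": 10_000}
images = {
    "ami-web": [("P", "P", "P"), ("N", "N", "C")],
    "ami-db":  [("C", "C", "C")],
    "ami-min": [("N", "P", "N")],
}
print(rank_images(images, losses))  # least risky image first
```

A full treatment would also weigh exploitability and aggregate per-goal impacts through an attack tree rather than a flat sum, but the same ordering principle applies.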


Keywords: Cloud Security · Economics of Security · Security Metrics · Security Quantification · Vulnerability Assessment





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Hamza Ghani¹
  • Jesus Luna Garcia¹
  • Ivaylo Petkov¹
  • Neeraj Suri¹

  1. Technische Universität Darmstadt, Germany
