
Some Vulnerabilities Are Different Than Others

Studying Vulnerabilities and Attack Surfaces in the Wild
  • Kartik Nayak
  • Daniel Marino
  • Petros Efstathopoulos
  • Tudor Dumitraş
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8688)

Abstract

The security of deployed and actively used systems is a moving target, influenced by factors not captured in existing security metrics. For example, the count and severity of vulnerabilities in source code, as well as the corresponding attack surface, are commonly used as measures of a software product’s security. But these measures do not provide a full picture. For instance, some vulnerabilities are never exploited in the wild, partly due to security technologies that make exploiting them difficult. As for attack surface, its effectiveness has not been validated empirically in the deployment environment. We introduce several security metrics derived from field data that help to complete the picture. They include the count of vulnerabilities exploited and the size of the attack surface actually exercised in real-world attacks. By evaluating these metrics on nearly 300 million reports of intrusion-protection telemetry, collected on more than six million hosts, we conduct an empirical study of security in the deployment environment. We find that none of the products in our study have more than 35% of their disclosed vulnerabilities exploited in the wild. Furthermore, the exploitation ratio and the exercised attack surface tend to decrease with newer product releases. We also find that hosts that quickly upgrade to newer product versions tend to have a reduced exercised attack surface. The proposed metrics enable a more complete assessment of the security posture of enterprise infrastructure. Additionally, they open up new research directions for improving security by focusing on the vulnerabilities and attacks that have the highest impact in practice.
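
The central field-data metric described above, the exploitation ratio, relates a product's disclosed vulnerabilities to those actually observed under attack. The sketch below is not the authors' code; it is a minimal illustration, assuming two hypothetical inputs: a product's disclosed CVE identifiers (e.g., drawn from NVD) and the set of CVEs seen exploited in intrusion-protection telemetry. All identifiers and values are illustrative.

    # Minimal sketch (assumed inputs, not the paper's pipeline): computing an
    # exploitation ratio, i.e. the fraction of a product's disclosed
    # vulnerabilities that field telemetry shows being exploited.
    from typing import Iterable, Set

    def exploitation_ratio(disclosed_cves: Iterable[str],
                           exploited_cves: Set[str]) -> float:
        """Fraction of disclosed vulnerabilities observed exploited in the wild."""
        disclosed = set(disclosed_cves)
        if not disclosed:
            return 0.0
        return len(disclosed & exploited_cves) / len(disclosed)

    # Illustrative values only; the study reports that no product examined
    # exceeds a 35% exploitation ratio.
    disclosed = ["CVE-2013-0001", "CVE-2013-0002", "CVE-2013-0003", "CVE-2013-0004"]
    seen_in_telemetry = {"CVE-2013-0002"}
    print(exploitation_ratio(disclosed, seen_in_telemetry))  # 0.25

The exercised attack surface would be estimated analogously, by intersecting a product's attack-surface features with the entry points actually targeted in recorded attacks.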

Keywords

Security Technology, Exploitation Ratio, Windows Vista, Deployment Environment, Attack Surface

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Kartik Nayak (1)
  • Daniel Marino (2)
  • Petros Efstathopoulos (2)
  • Tudor Dumitraş (1)
  1. University of Maryland, College Park, USA
  2. Symantec Research Labs, USA
