
CASFinder: Detecting Common Attack Surface

  • Mengyuan Zhang
  • Yue Xin
  • Lingyu Wang
  • Sushil Jajodia
  • Anoop Singhal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11559)

Abstract

Code reuse is a common practice in software development due to its various benefits. Such a practice, however, may also cause large-scale security issues, since a single vulnerability may appear in many different software applications through cloned code fragments. The well-known concept of relying on software diversity for security may also be compromised, since seemingly different software may in fact share vulnerable code fragments. Although there exist efforts on detecting cloned code fragments, solutions are lacking for formally characterizing their specific impact on security. In this paper, we revisit the concept of software diversity from a security viewpoint. Specifically, we define the novel concept of common attack surface to model the relative degree to which a pair of software applications may share potentially vulnerable code fragments. To implement the concept, we develop an automated tool, CASFinder, to efficiently identify the common attack surface between any given pair of software applications with minimal human intervention. Finally, we conduct experiments by applying our tool to real-world open source software applications. Our results demonstrate that many seemingly unrelated software applications indeed share significant common attack surface.
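To make the notion of common attack surface more concrete, the following is a minimal, hypothetical Python sketch, not the paper's CASFinder algorithm, that approximates the code shared between two source trees by fingerprinting fixed-size token windows in the spirit of token-based clone detection. The window size, the file extensions scanned, and the ratio definition are illustrative assumptions only.

    # Hypothetical sketch: a crude proxy for "common attack surface" between two
    # code bases, measured as the fraction of shared code-fragment fingerprints.
    # This is NOT the CASFinder algorithm from the paper; it only illustrates the
    # general idea of quantifying shared (potentially vulnerable) code fragments.
    import hashlib
    import re
    from pathlib import Path

    WINDOW = 30  # tokens per fragment fingerprint (illustrative choice)

    def tokenize(source: str) -> list[str]:
        """Very rough C-like tokenizer: identifiers, numbers, and punctuation."""
        return re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", source)

    def fingerprints(root: str) -> set[str]:
        """Hash every sliding window of WINDOW tokens in all .c/.h files under root."""
        prints = set()
        for path in Path(root).rglob("*"):
            if path.suffix not in {".c", ".h"}:
                continue
            tokens = tokenize(path.read_text(errors="ignore"))
            for i in range(len(tokens) - WINDOW + 1):
                window = " ".join(tokens[i:i + WINDOW])
                prints.add(hashlib.sha1(window.encode()).hexdigest())
        return prints

    def common_surface_ratio(root_a: str, root_b: str) -> float:
        """Shared fingerprints relative to the smaller code base (0 = disjoint, 1 = subset)."""
        a, b = fingerprints(root_a), fingerprints(root_b)
        if not a or not b:
            return 0.0
        return len(a & b) / min(len(a), len(b))

    if __name__ == "__main__":
        # Example: compare two local source trees (the paths are placeholders).
        print(common_surface_ratio("./software_a", "./software_b"))

A higher ratio would suggest the two applications share more potentially vulnerable code; the actual CASFinder tool described in the paper identifies such overlap with far more precision than this toy fingerprint intersection.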

Notes

Acknowledgment

The authors affiliated with Concordia University were partially supported by the Natural Sciences and Engineering Research Council of Canada under Discovery Grant N01035. Sushil Jajodia was supported in part by the National Institute of Standards and Technology under grants 60NANB16D287 and 60NANB18D168, the National Science Foundation under grant IIP-1266147, the Army Research Office under grant W911NF-13-1-0421, and the Office of Naval Research under grant N00014-15-1-2007.


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  • Mengyuan Zhang (1)
  • Yue Xin (1)
  • Lingyu Wang (1)
  • Sushil Jajodia (2)
  • Anoop Singhal (3)

  1. Concordia Institute for Information Systems Engineering, Concordia University, Montreal, Canada
  2. Center for Secure Information Systems, George Mason University, Fairfax, USA
  3. Computer Security Division, National Institute of Standards and Technology, Gaithersburg, USA
