
Ghost Patches: Fake Patches for Fake Vulnerabilities

  • Jeffrey Avery
  • Eugene H. Spafford
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 502)

Abstract

Offensive and defensive players in the cyber security sphere constantly react to the other party’s actions. This reactive approach works well for attackers but can be devastating for defenders. The same pattern characterizes the software security patching lifecycle: patches fix security flaws, but once deployed, they can be reverse engineered to develop malicious exploits.

To make exploit generation from patches more resource intensive, we propose inserting deception into software security patches. These ghost patches fix legitimate flaws in code while misleading attackers with decoy changes. An adversary using ghost patches to develop exploits is forced to expend additional resources. We implement a proof of concept for ghost patches and evaluate their impact on program analysis and runtime. We find that these patches have a statistically significant impact on dynamic analysis runtime, increasing time to analyze by a factor of up to 14x, but do not have a statistically significant impact on program runtime.

Keywords

Conditional Statement · Symbolic Execution · Security Flaw · Static Analysis Tool · Vulnerable System

Notes

Acknowledgements

The authors would like to thank the anonymous reviewers for their comments and suggestions, and especially thank Breanne N. Wright, Christopher N. Gutierrez, Oyindamola D. Oluwatimi, and Scott A. Carr for their time, discussion, and ideas. The National Science Foundation supported this research under award number 1548114. All ideas presented and conclusions or recommendations provided in this document are solely those of the authors.


Copyright information

© IFIP International Federation for Information Processing 2017

Authors and Affiliations

  1. Computer Science Department and CERIAS, Purdue University, West Lafayette, USA
