Automatically Adapting a Trained Anomaly Detector to Software Patches

  • Peng Li
  • Debin Gao
  • Michael K. Reiter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5758)

Abstract

In order to detect a compromise of a running process based on it deviating from its program’s normal system-call behavior, an anomaly detector must first be trained with traces of system calls made by the program when provided clean inputs. When a patch for the monitored program is released, however, the system call behavior of the new version might differ from that of the version it replaces, rendering the anomaly detector too inaccurate for monitoring the new version. In this paper we explore an alternative to collecting traces of the new program version in a clean environment (which may take effort to set up), namely adapting the anomaly detector to accommodate the differences between the old and new program versions. We demonstrate that this adaptation is feasible for such an anomaly detector, given the output of a state-of-the-art binary difference analyzer. Our analysis includes both proofs of properties of the adapted detector, and empirical evaluation of adapted detectors based on four software case studies.
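The abstract assumes familiarity with system-call anomaly detectors of the kind pioneered by Forrest et al. [7]: train on windows of system calls from clean runs, then flag windows never seen in training. The following is a minimal sketch of that idea (not the paper's own detector, which is more sophisticated); the trace data and window length are hypothetical, chosen only to illustrate why a patch that changes call behavior triggers false alarms until the model is adapted.

```python
# Minimal sketch of a window-based (stide-style) system-call anomaly
# detector. Trains on clean traces, then reports windows of a monitored
# trace that were never observed during training. All traces below are
# hypothetical examples.

def train(traces, k=3):
    """Collect the set of length-k system-call windows seen in clean traces."""
    normal = set()
    for trace in traces:
        for i in range(len(trace) - k + 1):
            normal.add(tuple(trace[i:i + k]))
    return normal

def mismatches(trace, normal, k=3):
    """Return the windows of a monitored trace absent from the normal set."""
    return [tuple(trace[i:i + k])
            for i in range(len(trace) - k + 1)
            if tuple(trace[i:i + k]) not in normal]

# Hypothetical clean traces collected from the old program version.
clean = [["open", "read", "write", "close"],
         ["open", "read", "read", "write", "close"]]
model = train(clean)

# A patched version may legitimately emit new call sequences (here, an
# added mmap); without adapting the model, these raise false alarms.
patched_run = ["open", "mmap", "read", "write", "close"]
alarms = mismatches(patched_run, model)
```

The paper's contribution is, in effect, to update `model` automatically from the semantic differences a binary difference analyzer (BinHunt [10]) reports between the old and new binaries, instead of re-collecting clean traces of the patched program.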

Keywords

Anomaly detection · Software patches · System-call monitoring · Binary difference analysis



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Peng Li (1)
  • Debin Gao (2)
  • Michael K. Reiter (1)
  1. Department of Computer Science, University of North Carolina, Chapel Hill, USA
  2. School of Information Systems, Singapore Management University, Singapore
