
Anomaly Detector Performance Evaluation Using a Parameterized Environment

  • Jeffery P. Hansen
  • Kymie M. C. Tan
  • Roy A. Maxion
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4219)

Abstract

Over the years, intrusion detection has matured into a field replete with anomaly detectors of various types. These detectors are tasked with detecting computer-based attacks, insider threats, worms and more. Their abundance naturally prompts the question: is anomaly detection improving in efficacy and reliability? Current evaluation strategies may provide answers, but they suffer from serious limitations. For example, they produce results that are valid only within the evaluation data set, and they provide very little diagnostic information with which to tune detector performance in a principled manner.

This paper studies the problem of acquiring reliable performance results for an anomaly detector. Aspects of a data environment that will affect detector performance, such as the frequency distribution of data elements, are identified, characterized and used to construct a synthetic data environment to assess a frequency-based anomaly detector. In a series of experiments that systematically maps out the detector's performance, areas of detection weakness are exposed and strengths are identified. Finally, the extensibility of the lessons learned in the synthetic environment is observed using real-world data.
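The core idea above can be illustrated with a minimal sketch: draw a training stream from a known, skewed symbol-frequency distribution (the parameterized "environment"), estimate per-symbol probabilities, and score test windows by their average rarity. This is not the paper's detector; the alphabet, weights, threshold, and the `surprise` scoring function are all hypothetical choices made for illustration.

```python
import math
import random
from collections import Counter

random.seed(42)

# Hypothetical "environment": a small alphabet with a skewed,
# Zipf-like frequency distribution over its symbols.
alphabet = list("abcdefgh")
weights = [40, 25, 15, 8, 5, 4, 2, 1]

# Training data drawn from the environment's distribution.
train = random.choices(alphabet, weights=weights, k=5000)

# Estimate per-symbol probabilities with add-one smoothing so
# unseen symbols still get a small nonzero probability.
counts = Counter(train)
total = len(train) + len(alphabet)
prob = {s: (counts[s] + 1) / total for s in alphabet}

def surprise(window):
    """Average negative log-probability ("rarity") of a window of events."""
    return sum(-math.log(prob.get(s, 1 / total)) for s in window) / len(window)

# A window drawn from the training distribution scores low...
normal = random.choices(alphabet, weights=weights, k=20)
# ...while a window dominated by rare symbols scores high.
odd = ["h"] * 20

# An alarm threshold; in practice this would be tuned, e.g. via ROC analysis.
threshold = 2.5

print(f"normal window score: {surprise(normal):.2f}")
print(f"rare-symbol window score: {surprise(odd):.2f}")
```

Because the generating distribution is fully known, the detector's behavior can be mapped systematically: regenerating the environment with different weight vectors shows directly how distributional skew affects hit and false-alarm rates.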

Keywords

anomaly detection · performance modeling · IDS evaluation · tuning



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jeffery P. Hansen¹
  • Kymie M. C. Tan¹
  • Roy A. Maxion¹

  1. Carnegie Mellon University, Pittsburgh, USA
