Analysis and Results of the 1999 DARPA Off-Line Intrusion Detection Evaluation
Eight sites participated in the second DARPA off-line intrusion detection evaluation in 1999. Three weeks of training data and two weeks of test data were generated on a test bed that emulates a small government site. More than 200 instances of 58 attack types were launched against victim UNIX and Windows NT hosts. False alarm rates were low (less than 10 per day). The best detection was provided by network-based systems for old probe and old denial-of-service (DoS) attacks, and by host-based systems for Solaris user-to-root (U2R) attacks. The best overall performance would have been provided by a combined system that used both host- and network-based intrusion detection. Detection accuracy was poor for new (previously unseen), stealthy, and Windows NT attacks; ten of the 58 attack types were completely missed by all systems. Systems missed attacks because protocols and TCP services were not analyzed at all or not to the depth required, because signatures for old attacks did not generalize to new attacks, and because auditing was not available on all hosts.
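The evaluation scored systems by the fraction of attack instances detected at a low false-alarm operating point (fewer than 10 false alarms per day). The sketch below illustrates that style of scoring; it is a hypothetical simplification, not the evaluation's official scoring software, and the function name and alert format are invented for illustration. Each alert is a (score, attack_id) pair, where `attack_id` is `None` for an alert on background traffic.

```python
# Hypothetical sketch (NOT the official DARPA scoring code): sweep the alert
# score threshold from most to least confident and report the fraction of
# attack instances detected before the false-alarm budget is exhausted.

def detections_at_fa_budget(alerts, num_attacks, days, max_fa_per_day=10):
    """Return the detected fraction of attacks at the most permissive
    threshold that keeps false alarms at or below max_fa_per_day * days."""
    budget = max_fa_per_day * days
    false_alarms = 0
    detected = set()
    # Process alerts in descending score order, i.e. lower the threshold
    # one alert at a time.
    for score, attack_id in sorted(alerts, key=lambda a: -a[0]):
        if attack_id is None:           # alert on background traffic
            false_alarms += 1
            if false_alarms > budget:   # budget exceeded; stop here
                break
        else:
            detected.add(attack_id)     # alert matched a labeled attack
    return len(detected) / num_attacks

# Toy example: 3 labeled attack instances over 2 days of data.
alerts = [(0.9, "a1"), (0.8, None), (0.7, "a2"), (0.2, None), (0.1, "a3")]
rate = detections_at_fa_budget(alerts, num_attacks=3, days=2)
```

With the default budget of 10 false alarms per day, all three toy attacks are detected; tightening the budget cuts off the low-scoring alerts and the detection fraction drops, which is the trade-off the evaluation's ROC-style analysis measures.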
Keywords: False Alarm Rate; Intrusion Detection; Intrusion Detection System; Attack Trace; Intrusion Detection Evaluation