Redesign and Implementation of Evaluation Dataset for Intrusion Detection System
Although the intrusion detection system (IDS) industry is rapidly maturing, the state of IDS evaluation is not. The off-line dataset evaluation proposed by MIT Lincoln Laboratory is a practical way to measure IDS performance. While that evaluation dataset represents a significant undertaking, several issues in its design and modeling remain unsolved and may bias evaluation results. Some researchers have noticed these problems and criticized the design and execution of the dataset, but no concrete technical alternative has been proposed. In this paper we present our efforts to redesign and generate a new dataset. We first study how network applications and user behaviors characterize network traffic. Second, we improve the simulation of background traffic (including HTTP, SMTP, POP, P2P, FTP, and other types of traffic). Unlike the existing model, ours simulates traffic at the user level rather than at the packet level, which is more reasonable for background traffic modeling and simulation. Our model draws on user-level web mining, automatic user profiling, and the Enron email dataset. Experiments show the high fidelity of the simulated background traffic. Moreover, we profile different kinds of attacker personalities and launch more than 300 instances of 62 different automated attacks against victim hosts and servers. All these efforts aim to make the dataset more realistic and therefore fairer for IDS evaluation.
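The user-level simulation described above can be illustrated with a minimal sketch. The `UserProfile` class, the per-protocol weights, and the exponential think-time model here are our own illustrative assumptions, not the paper's actual implementation: the point is that events are generated at the granularity of user actions (a page request, a mail send), which a separate replay stage would then turn into real protocol traffic, rather than synthesizing individual packets directly.

```python
import random
from dataclasses import dataclass


@dataclass
class UserProfile:
    """Hypothetical per-user behavior model (illustrative only)."""
    protocol_weights: dict   # relative frequency of each application protocol
    mean_think_time: float   # mean pause between user actions, in seconds


def simulate_session(profile, n_actions, seed=0):
    """Generate a user-level event timeline as (protocol, timestamp) pairs.

    Events represent user actions, not packets; think times are drawn
    from an exponential distribution as a simple stand-in for the
    behavior statistics a real profiler would mine.
    """
    rng = random.Random(seed)
    protocols = list(profile.protocol_weights)
    weights = list(profile.protocol_weights.values())
    t = 0.0
    events = []
    for _ in range(n_actions):
        t += rng.expovariate(1.0 / profile.mean_think_time)
        events.append((rng.choices(protocols, weights=weights)[0], round(t, 2)))
    return events


# Example: an office-style user who mostly browses and occasionally mails.
office_user = UserProfile({"HTTP": 0.6, "SMTP": 0.2, "POP": 0.2},
                          mean_think_time=30.0)
timeline = simulate_session(office_user, n_actions=5, seed=42)
print(timeline)
```

Because each event carries an application protocol and a timestamp, a downstream replay component can map the same timeline onto any test network, which is what makes user-level modeling more portable than captured packet traces.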
Keywords: Intrusion Detection · User Profile · Intrusion Detection System · Attack Personality · Reference Network