Redesign and Implementation of Evaluation Dataset for Intrusion Detection System

  • Jun Qian
  • Chao Xu
  • Meilin Shi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3995)


Although the intrusion detection system (IDS) industry is rapidly maturing, the state of IDS evaluation is not. The off-line dataset evaluation proposed by MIT Lincoln Laboratory is a practical approach to measuring IDS performance. While that evaluation dataset represents a significant and monumental undertaking, several unresolved issues in its design and modeling may bias evaluation results. Some researchers have noted these problems and criticized the design and execution of the dataset, but no concrete technical alternative has been proposed. In this paper we present our efforts to redesign and generate a new dataset. We first study how network applications and user behaviors characterize network traffic. We then improve the simulation of background traffic (including HTTP, SMTP, POP, P2P, FTP, and other types). Unlike the existing model, ours simulates traffic at the user level rather than the packet level, which is more appropriate for modeling and simulating background traffic. Our model draws on user-level web mining, automatic user profiling, and the Enron email dataset, among other sources. Experiments show the high fidelity of the simulated background traffic. Moreover, we profile several kinds of attacker personalities and launch more than 300 instances of 62 different automated attacks against victim hosts and servers. All these efforts aim to make the dataset more realistic and therefore fairer for IDS evaluation.
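The user-level simulation described in the abstract (sessions driven by per-user profiles rather than replayed packet traces) can be illustrated with a minimal sketch. The profile fields, protocol mix, and the Poisson/lognormal timing assumptions below are illustrative choices, not the authors' actual model or parameters.

```python
import random

# Protocols the paper's background traffic covers.
PROTOCOLS = ["HTTP", "SMTP", "POP", "P2P", "FTP"]

def make_profile(rng, name):
    """Assign a user a random (normalized) preference weight per protocol."""
    weights = [rng.random() for _ in PROTOCOLS]
    total = sum(weights)
    return {"user": name, "mix": {p: w / total for p, w in zip(PROTOCOLS, weights)}}

def simulate_day(profile, n_sessions, rng):
    """Generate (start_time_s, protocol, duration_s) session events for one user.

    Inter-session gaps are exponential (Poisson arrivals); durations are
    lognormal, a common heavy-tailed choice for session lengths. Both are
    assumptions for this sketch.
    """
    events, t = [], 0.0
    protos, weights = zip(*profile["mix"].items())
    for _ in range(n_sessions):
        t += rng.expovariate(1 / 600)            # mean 10 min between sessions
        proto = rng.choices(protos, weights)[0]  # pick protocol per user mix
        duration = rng.lognormvariate(3.0, 1.0)  # heavy-tailed duration (s)
        events.append((t, proto, duration))
    return events

rng = random.Random(42)
alice = make_profile(rng, "alice")
day = simulate_day(alice, 50, rng)
print(len(day), all(p in PROTOCOLS for _, p, _ in day))
```

A full generator in this spirit would then render each session event into actual application-layer traffic (e.g., HTTP requests or SMTP transactions), which is where corpora such as the Enron emails come in.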


Keywords: Intrusion Detection · User Profile · Intrusion Detection System · Attack Personality · Reference Network





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jun Qian (1)
  • Chao Xu (1)
  • Meilin Shi (1)
  1. Department of Computer Science, Tsinghua University, Beijing, P.R. China
