A Comparative Evaluation of Anomaly Detectors under Portscan Attacks

  • Ayesha Binte Ashfaq
  • Maria Joseph Robert
  • Asma Mumtaz
  • Muhammad Qasim Ali
  • Ali Sajjad
  • Syed Ali Khayam
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5230)


Since the seminal 1998/1999 DARPA evaluations of intrusion detection systems, network attacks have evolved considerably. In particular, since the CodeRed worm of 2001, the volume and sophistication of self-propagating malicious code threats have grown at an alarming rate. Many anomaly detectors have been proposed, especially in the past few years, to combat these new and emerging network attacks. It is therefore timely to evaluate existing anomaly detectors and learn from their strengths and shortcomings. In this paper, we evaluate the performance of eight prominent network-based anomaly detection systems (ADSs) under malicious portscan attacks. These ADSs are evaluated on four criteria: accuracy (ROC curves), scalability (with respect to varying normal and attack traffic rates, and deployment points), complexity (CPU and memory requirements during training and classification), and detection delay. These criteria are evaluated using two independently collected datasets with complementary strengths. Our results show that a few of the anomaly detectors achieve high accuracy on one of the two datasets, but are unable to sustain that accuracy across both. Based on our experiments, we identify promising guidelines for improving the accuracy and scalability of existing and future anomaly detectors.
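The accuracy criterion above is measured with ROC curves, which plot the detection (true-positive) rate against the false-alarm rate as the detector's alerting threshold is swept. As a minimal sketch of how such a curve and its area (AUC) can be computed from per-instance anomaly scores and ground-truth labels, the following self-contained example uses illustrative data, not the scores of any detector evaluated in the paper:

```python
def roc_points(scores, labels):
    """Return (false-positive rate, true-positive rate) pairs obtained by
    sweeping a detection threshold over anomaly scores (higher = more anomalous).
    `labels` holds 1 for attack instances and 0 for normal ones."""
    pos = sum(labels)
    neg = len(labels) - pos
    # Sort by descending score: each step lowers the threshold to admit
    # one more instance as "detected".
    ranked = sorted(zip(scores, labels), reverse=True)
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, is_attack in ranked:
        if is_attack:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x2 - x1) * (y1 + y2) / 2
               for (x1, y1), (x2, y2) in zip(points, points[1:]))

# Hypothetical scores: two attacks, two normal instances, one misranked.
curve = roc_points([0.9, 0.8, 0.7, 0.1], [1, 0, 1, 0])
print(auc(curve))
```

A detector that ranks every attack above every normal instance yields an AUC of 1.0; the misranked example above yields 0.75, reflecting the false alarm incurred before the second attack is caught.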


Keywords: False Alarm · Maximum Entropy · False Alarm Rate · Intrusion Detection · Anomaly Detector




Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Ayesha Binte Ashfaq (1)
  • Maria Joseph Robert (1)
  • Asma Mumtaz (1)
  • Muhammad Qasim Ali (1)
  • Ali Sajjad (1)
  • Syed Ali Khayam (1)
  1. School of Electrical Engineering & Computer Science, National University of Sciences & Technology (NUST), Rawalpindi, Pakistan
