Analyzing the traffic of penetration testing tools with an IDS

  • Fernando Román Muñoz
  • Esteban Alejandro Armas Vega
  • Luis Javier García Villalba

Abstract

Many papers have been published comparing the accuracy of automated tools that look for vulnerabilities in web applications. In those previous studies the researchers analyze vulnerable web applications with pentesting tools and then compare the reports that the tools generate. The aim of this work is not only to measure the detection capabilities of the tools, but also to determine which tests they actually perform, which vulnerabilities they try to detect, and which vulnerabilities the web application really contains. In this way it can be established whether the tests carried out by the automated tools are efficient and satisfy two important requirements of such analysis tools: an automated tool has to attempt to detect every vulnerability in the web application for which it has a detection feature, and it has to report every vulnerability that it detects.
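
To make the approach concrete, the sketch below shows one way the IDS alerts generated while a scanner runs could be summarized to see which tests each tool actually sent against the target application. It is a minimal sketch, assuming Snort's default "fast" alert output; the file name alert.fast, the rule messages, and the summarization step are illustrative assumptions and not part of the published study.

    import re
    from collections import Counter

    # Assumed location of Snort's "fast" alert log captured while a single
    # scanner (e.g. ZAP, Skipfish, w3af) was crawling the test application.
    ALERT_FILE = "alert.fast"  # hypothetical path, adjust to your Snort setup

    # A Snort fast-format alert line looks roughly like:
    # 04/29-10:15:32.123456 [**] [1:1000001:1] SQLI test in URI [**]
    #   [Classification: Web Application Attack] [Priority: 1] {TCP} 10.0.0.5:4321 -> 10.0.0.9:80
    MSG_RE = re.compile(r"\[\*\*\]\s+\[\d+:\d+:\d+\]\s+(?P<msg>.+?)\s+\[\*\*\]")

    def summarize_alerts(path: str) -> Counter:
        """Count how often each rule message fired, i.e. which tests the scanner performed."""
        counts: Counter = Counter()
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                match = MSG_RE.search(line)
                if match:
                    counts[match.group("msg")] += 1
        return counts

    if __name__ == "__main__":
        for message, hits in summarize_alerts(ALERT_FILE).most_common():
            print(f"{hits:6d}  {message}")

Comparing such a per-tool summary of fired rules against the scanner's own report and against the known vulnerabilities of the test application is what allows the two requirements above to be checked.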

Keywords

Automatic scanner tools · Cybersecurity · Web vulnerabilities

Acknowledgements

This work was funded by the European Commission Horizon 2020 Programme under Grant Agreement Number H2020-FCT-2015/700326-RAMSES (Internet Forensic Platform for Tracking the Money Flow of Financially-Motivated Malware).

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Fernando Román Muñoz (1)
  • Esteban Alejandro Armas Vega (1)
  • Luis Javier García Villalba (1)

  1. Group of Analysis, Security and Systems (GASS), Department of Software Engineering and Artificial Intelligence (DISIA), Faculty of Computer Science and Engineering, Office 431, Universidad Complutense de Madrid (UCM), Madrid, Spain