Volume 101, Issue 2, pp 161–185

An empirical study on combining diverse static analysis tools for web security vulnerabilities based on development scenarios

  • Paulo Nunes
  • Ibéria Medeiros
  • José Fonseca
  • Nuno Neves
  • Miguel Correia
  • Marco Vieira


Abstract

Automated Static Analysis Tools (ASATs) are one of the best ways to search for vulnerabilities in applications, so they are widely used by developers to improve their applications. However, it is well known that the performance of such tools is limited, and their detection capabilities may not meet the requirements of the project regarding the criticality of the application. Diversity is an obvious direction to take to increase true positives, since different tools usually report distinct vulnerabilities, but it comes at the cost of also increasing false positives, which may be unacceptable in some scenarios. In this paper, we study the problem of combining diverse ASATs to improve the overall detection of vulnerabilities in web applications, considering four development scenarios with different criticality goals and constraints. These scenarios range from low-budget to high-end (e.g., business-critical) web applications. We tested five ASATs on two datasets, one with real WordPress plugins and another with synthetic test cases. Our findings reveal that combining the outputs of several ASATs does not always improve vulnerability detection performance over a single ASAT. Using our procedure, a developer is able to choose the combination of ASATs that best fits the project requirements.
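The trade-off described above can be illustrated with a minimal sketch (not the paper's actual procedure): combining the findings of several ASATs with a 1-out-of-N rule (union) maximizes true positives at the cost of accumulating every tool's false positives, while requiring agreement among more tools trades detection for precision. The `(file, line, vulnerability_type)` finding format and the tool reports below are hypothetical.

```python
from collections import Counter

def combine_findings(tool_reports, min_tools=1):
    """Keep a finding if at least `min_tools` of the tools reported it.

    min_tools=1 is the union (1-out-of-N): best detection, most false
    positives -- suited to business-critical scenarios.
    min_tools=len(tool_reports) is the intersection (N-out-of-N): only
    findings every tool agrees on survive, minimizing false positives.
    """
    counts = Counter()
    for report in tool_reports:
        for finding in set(report):  # de-duplicate within a single tool
            counts[finding] += 1
    return {f for f, n in counts.items() if n >= min_tools}

# Toy reports from three hypothetical ASATs:
tool_a = [("login.php", 42, "SQLi"), ("view.php", 10, "XSS")]
tool_b = [("login.php", 42, "SQLi"), ("admin.php", 7, "XSS")]
tool_c = [("view.php", 10, "XSS")]

union = combine_findings([tool_a, tool_b, tool_c], min_tools=1)     # 3 findings
majority = combine_findings([tool_a, tool_b, tool_c], min_tools=2)  # 2 findings
```

The choice of `min_tools` mirrors the development scenarios in the paper: a low-budget project with no time to triage alerts would raise it, whereas a business-critical one would lower it and accept the extra triage effort.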


Keywords

Static analysis · Vulnerability detection · XSS · SQLi

Mathematics Subject Classification

68M15 (Reliability) · 68M11 (Internet topics)



Copyright information

© Springer-Verlag GmbH Austria, part of Springer Nature 2018

Authors and Affiliations

  • Paulo Nunes (1, 4)
  • Ibéria Medeiros (2)
  • José Fonseca (1, 4)
  • Nuno Neves (2)
  • Miguel Correia (3)
  • Marco Vieira (4)
  1. Unidade de Investigação para o Desenvolvimento do Interior, Guarda, Portugal
  2. LASIGE, Faculdade de Ciências, Universidade de Lisboa, Lisbon, Portugal
  3. INESC-ID, Instituto Superior Técnico, Universidade de Lisboa, Lisbon, Portugal
  4. CISUC, University of Coimbra, Coimbra, Portugal
