
An empirical study on combining diverse static analysis tools for web security vulnerabilities based on development scenarios

Published in: Computing


Automated Static Analysis Tools (ASATs) are one of the best ways to search for vulnerabilities in applications, so they are widely used by developers to improve their applications. However, it is well known that the performance of such tools is limited, and their detection capabilities may not meet the requirements of the project regarding the criticality of the application. Diversity is an obvious direction to take to improve the true positives, as different tools usually report distinct vulnerabilities, but at the cost of also increasing the false positives, which may be unacceptable in some scenarios. In this paper, we study the problem of combining diverse ASATs to improve the overall detection of vulnerabilities in web applications, considering four development scenarios with different criticality goals and constraints. These scenarios range from low-budget to high-end (e.g., business-critical) web applications. We tested five ASATs on two datasets, one with real WordPress plugins and another with synthetic test cases. Our findings reveal that combining the outputs of several ASATs does not always improve the vulnerability detection performance over a single ASAT. Using our procedure, a developer can choose the combination of ASATs that best fits the project requirements.
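The core trade-off described above can be sketched in code. The following is an illustrative example, not the paper's actual procedure: the tool names, findings, and ground truth are hypothetical, and findings are modeled simply as sets so that a "1-out-of-N" combination is the union of the selected tools' reports.

```python
# Sketch (assumed model, not the paper's implementation): each ASAT's
# report is a set of (file, line, vulnerability_type) findings, and a
# 1-out-of-N combination accepts a finding if ANY tool reports it.
from itertools import combinations

reports = {
    "tool_a": {("login.php", 12, "SQLi"), ("search.php", 40, "XSS")},
    "tool_b": {("login.php", 12, "SQLi"), ("admin.php", 7, "XSS")},
    "tool_c": {("search.php", 40, "XSS"), ("index.php", 3, "XSS")},
}

# Ground truth: vulnerabilities actually present in the code.
ground_truth = {
    ("login.php", 12, "SQLi"),
    ("search.php", 40, "XSS"),
    ("admin.php", 7, "XSS"),
}

def evaluate(findings):
    """Return (true positives, false positives) for a set of findings."""
    tp = len(findings & ground_truth)
    fp = len(findings - ground_truth)
    return tp, fp

# Enumerate every subset of tools: unioning more tools tends to raise
# true positives, but false positives can grow as well, so the best
# combination depends on the project's criticality goals.
for size in range(1, len(reports) + 1):
    for subset in combinations(reports, size):
        combined = set().union(*(reports[t] for t in subset))
        tp, fp = evaluate(combined)
        print(f"{'+'.join(subset)}: TP={tp}, FP={fp}")
```

In this toy data, the union of all three tools detects every real vulnerability (3 true positives) but also introduces a false positive, whereas a single tool may report fewer true positives with no false positives — the tension the scenarios in the paper are built around.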


Fig. 1
Fig. 2




Author information



Corresponding author

Correspondence to Paulo Nunes.

Additional information

This work extends a preliminary version presented at the 13th European Dependable Computing Conference (EDCC 2017).


About this article


Cite this article

Nunes, P., Medeiros, I., Fonseca, J. et al. An empirical study on combining diverse static analysis tools for web security vulnerabilities based on development scenarios. Computing 101, 161–185 (2019).
