The Journal of Supercomputing, Volume 74, Issue 3, pp 1061–1089

An algorithm to find relationships between web vulnerabilities

  • Fernando Román Muñoz
  • Luis Javier García Villalba

Abstract

Over the past few years, the number of web sites using cloud computing has increased considerably. Like conventional web sites, these web applications can exhibit most of the common web vulnerabilities, such as SQL injection or cross-site scripting, and cloud computing has therefore become more attractive to cyber criminals. Moreover, in many cases it is necessary to comply with regulations such as PCI DSS or standards such as ISO/IEC 27001. To address these threats and requirements, web applications are routinely analyzed to detect and correct their vulnerabilities, and the tools most widely used for this analysis are automatic scanners. However, it is difficult to decide which scanner is best overall, or at least which is best suited to detect a particular vulnerability. Evaluation criteria have been defined to assess scanner capabilities, and a web vulnerability classification is often used as well, but current web vulnerability classifications do not usually include all vulnerabilities. To cope with evaluation criteria that are not up to date, and to obtain the fullest possible classification, this paper proposes a new method to map web vulnerability classifications onto one another. The result is the set of vulnerabilities an automatic scanner has to detect. As classifications change over time, the method can be executed again whenever existing classifications change or new ones are developed. The vulnerabilities described in this way can also be seen as a web vulnerability classification that includes all the vulnerabilities in the classifications taken into account.
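
The mapping algorithm itself is described in the article body. As a rough, minimal sketch of the general idea only, assuming that pairwise mappings between entries of classifications such as the OWASP Top 10, the WASC Threat Classification and CWE are available, equivalent entries can be merged into a unified vulnerability set with a union-find structure. All identifiers and mapping pairs below are invented for illustration and are not taken from the paper:

```python
# Hypothetical sketch: merging web vulnerability classifications via
# pairwise mappings. All identifiers and mappings are invented for
# illustration; the paper's actual algorithm and data differ.
from collections import defaultdict

# Each classification is a set of vulnerability identifiers.
classifications = {
    "OWASP": {"OWASP-A1-Injection", "OWASP-A3-XSS"},
    "WASC": {"WASC-19-SQL-Injection", "WASC-08-XSS"},
    "CWE": {"CWE-89", "CWE-79"},
}

# Known pairwise mappings between entries of different classifications.
mappings = [
    ("OWASP-A1-Injection", "WASC-19-SQL-Injection"),
    ("WASC-19-SQL-Injection", "CWE-89"),
    ("OWASP-A3-XSS", "WASC-08-XSS"),
    ("WASC-08-XSS", "CWE-79"),
]

# Union-find over identifiers: entries mapped to each other end up
# in the same group, i.e. they denote the same vulnerability.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for a, b in mappings:
    union(a, b)

# Group every identifier by its representative; each group is one
# vulnerability an automatic scanner should be able to detect.
groups = defaultdict(set)
for entries in classifications.values():
    for entry in entries:
        groups[find(entry)].add(entry)

for members in groups.values():
    print(sorted(members))
```

Each resulting group gathers the identifiers that the mappings declare equivalent, so the groups together form the unified vulnerability list a scanner would have to detect under these assumptions.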

Keywords

Cloud computing vulnerabilities · Cyber-security · Vulnerability classifications · Web vulnerabilities

Acknowledgments

This work was funded by the European Commission Horizon 2020 Programme under Grant Agreement Number H2020-FCT-2015/700326-RAMSES (Internet Forensic Platform for Tracking the Money Flow of Financially Motivated Malware).


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Fernando Román Muñoz¹
  • Luis Javier García Villalba¹

  1. Group of Analysis, Security and Systems (GASS), Department of Software Engineering and Artificial Intelligence (DISIA), Faculty of Information Technology and Computer Science, Office 431, Universidad Complutense de Madrid (UCM), Madrid, Spain
