Maintainability of Automatic Acceptance Tests for Web Applications—A Case Study Comparing Two Approaches to Organizing Code of Test Cases

  • Aleksander Sadaj
  • Mirosław Ochodek
  • Sylwia Kopczyńska
  • Jerzy Nawrocki
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12011)


[Context] Agile software development calls for test automation, since it is critical for continuous development and delivery. However, automation is a challenging task, especially for tests of the user interface, which can be very expensive. [Problem] There are two contrasting approaches to structuring the code of test suites for web applications: linear scripting and the keyword-driven scripting technique employing the page object pattern. The goal of this research is to compare them with a focus on maintainability. [Method] We develop and maintain two automatic test suites implementing the same test cases for a mature open-source system, one using each approach. For each approach, we measure the size of the testing codebase and the number of lines of code that need to be modified to keep the test suites passing and valid through five releases of the system. [Results] We observed that the total number of physical lines of code was higher for the keyword-driven approach than for the linear-scripting one; however, the number of logical lines of code was smaller for the former. The number of lines of code that had to be modified to maintain the tests was lower for the keyword-driven test suite than for the linear-scripting one. We found the linear-scripting technique more difficult to maintain because its scripts consist only of low-level code that interacts directly with a web browser, making it hard to understand the purpose and broader context of the interactions they implement. [Conclusions] We conclude that test suites created using the keyword-driven approach are easier to maintain and more suitable for most projects. However, the results show that linear scripting could be considered a less expensive alternative for small projects that are unlikely to be modified frequently.


Keywords: Acceptance testing · Keyword-driven testing · Linear scripting · Web applications · Selenium · Cucumber



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Poznan University of Technology, Poznań, Poland
