
Empirical Software Engineering, Volume 20, Issue 3, pp 694–744

Visual GUI testing in practice: challenges, problems and limitations

  • Emil Alégroth
  • Robert Feldt
  • Lisa Ryrholm

Abstract

In today’s software development industry, high-level tests such as Graphical User Interface (GUI) based system and acceptance tests are mostly performed manually, a practice that is often costly, tedious and error prone. Test automation has been proposed to solve these problems, but most automation techniques approach testing from a lower level of system abstraction, and their suitability for high-level tests has therefore been questioned. High-level test automation techniques such as Record and Replay exist, but studies suggest that they suffer from limitations, e.g. sensitivity to GUI layout or code changes and dependence on the system's implementation. Visual GUI Testing (VGT) is an emerging technique in industrial practice with perceived higher flexibility and robustness to certain GUI changes than previous high-level (GUI) test automation techniques. The core of VGT is image recognition, which is applied to analyze and interact with the bitmap layer of a system’s front end. By coupling image recognition with test scripts, VGT tools can emulate end-user behavior on almost any GUI-based system, regardless of implementation language, operating system or platform. However, VGT is not without its own challenges, problems and limitations (CPLs), and, as for many other automated test techniques, there is a lack of empirically based knowledge of these CPLs and how they impact industrial applicability. Crucially, there is also a lack of information on the cost of applying this type of test automation in industry. This manuscript reports an empirical, multi-unit case study performed at two Swedish companies that develop safety-critical software. It studies their transition from manual system test cases to tests automated with VGT. In total, four different test suites that together include more than 300 high-level system test cases were automated for two systems, each comprising millions of lines of code.
The results show that the transitioned test cases could find defects in the tested systems and that all applicable test cases could be automated. However, during these transition projects a number of hurdles had to be addressed; in total, 58 different CPLs were identified and categorized into 26 types. We present these CPL types and an analysis of their implications for the transition to and use of VGT in industrial software development practice. In addition, four high-level solutions identified during the study are presented, which would address about half of the identified CPLs. Furthermore, collected metrics on the cost and return on investment of the VGT transition are reported, together with information about the VGT suites’ defect-finding ability. Nine of the identified defects are reported, five of which were unknown to testers with extensive experience of using the manual test suites. The main conclusion of this study is that even though there are many challenges related to the transition to and usage of VGT, the technique is still valuable, flexible and considered cost-effective by the industrial practitioners. The presented CPLs also provide decision support for the use and advancement of VGT and potentially of other automated testing techniques similar to VGT, e.g. Record and Replay.
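The image-recognition core described above can be illustrated with a minimal sketch. The toy code below, assuming only NumPy, locates a small bitmap "template" (e.g. a button image) inside a synthetic grayscale screenshot by exhaustive sum-of-squared-differences matching; production VGT tools such as Sikuli use more tolerant fuzzy matching on real screen captures, and all names and values here are purely illustrative.

```python
import numpy as np

def locate(screen: np.ndarray, template: np.ndarray):
    """Return the (row, col) of the best match of template inside screen.

    Exhaustive sliding-window search minimizing the sum of squared
    differences (SSD); a perfect match yields an SSD of zero.
    """
    th, tw = template.shape
    sh, sw = screen.shape
    best, best_pos = float("inf"), None
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            window = screen[r:r + th, c:c + tw]
            ssd = float(np.sum((window - template) ** 2))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic 20x20 "screenshot" with a distinctive 3x3 "button" at (12, 5).
screen = np.zeros((20, 20))
button = np.array([[1., 2., 1.],
                   [2., 9., 2.],
                   [1., 2., 1.]])
screen[12:15, 5:8] = button

pos = locate(screen, button)
print(pos)  # (12, 5): a VGT script would now "click" at this location
```

A test script built on such a primitive chains locate-and-interact steps (find a bitmap, click it, type input, then visually verify the expected result), which is what makes the approach independent of the GUI's implementation technology.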

Keywords

Visual GUI Testing; Industrial case study; Challenges, problems and limitations; System and acceptance test automation; Development cost

Acknowledgments

The authors of this manuscript would like to thank Saab AB for their participation in these projects and their continued support in answering the question of whether Visual GUI Testing is an industrially applicable technique. We would also like to thank the reviewers and the editor for their constructive and valuable feedback on the manuscript.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Department of Computer Science and Engineering, Chalmers University of Technology & University of Gothenburg, Gothenburg, Sweden
  2. Department of Software Engineering, Blekinge Institute of Technology, Karlskrona, Sweden
