30 Years of Automated GUI Testing: A Bibliometric Analysis

Part of the Communications in Computer and Information Science book series (CCIS, volume 1439)

Abstract

Context: Over the last 30 years, GUIs have changed considerably, becoming an everyday part of our lives through smartphones and other devices. More complex GUIs and a multitude of platforms have increased the challenges of testing software through the GUI. Objective: To visualise how the field of automated GUI testing has evolved by studying the growth of the field; the types of publications; influential events, papers and authors; collaboration among authors; and trends in GUI testing. Method: A bibliometric analysis of automated GUI testing, based on a systematic search of primary studies in Scopus from 1990 to 2020. Results: 744 publications were selected as primary studies. The majority were conference papers, the most cited paper was published in 2013, and the most published author has 53 papers. Conclusions: Automated GUI testing has grown continuously. Keyword analysis shows that testing of mobile interfaces will be the trend in the coming years, along with the integration of Artificial Intelligence and automated exploration techniques.
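The core bibliometric measures named in the abstract (growth of the field per year, the most productive author) can be sketched minimally in code. This is only an illustration of the kind of counting involved: the study itself used dedicated tooling (the Biblioshiny interface of the bibliometrix package, per the Notes), and the sample records below are invented for illustration, not data from the study.

```python
from collections import Counter

# Hypothetical rows mimicking a Scopus export: (year, authors, document type).
# The entries are illustrative only, not records from the study's dataset.
records = [
    (1991, ["Memon, A."], "Conference Paper"),
    (2013, ["Banerjee, I.", "Memon, A."], "Article"),
    (2019, ["Vos, T.", "Aho, P."], "Conference Paper"),
    (2019, ["Memon, A."], "Conference Paper"),
]

def growth_by_year(rows):
    """Publications per year -- the basic growth curve of a field."""
    return Counter(year for year, _, _ in rows)

def most_published_author(rows):
    """Author with the highest paper count (a productivity ranking)."""
    counts = Counter(a for _, authors, _ in rows for a in authors)
    return counts.most_common(1)[0]

print(growth_by_year(records))
print(most_published_author(records))   # ('Memon, A.', 3)
```

On the full dataset of 744 primary studies, the same two counts would yield the growth curve and the 53-paper top author reported in the Results.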

Keywords

  • Automated testing
  • Graphical user interface
  • Bibliometric analysis
  • Secondary study



Notes

  1. https://www.bibliometrix.org/Biblioshiny.html
  2. https://gui-testing-repository.testar.org/keywords
  3. https://gui-testing-repository.testar.org


Acknowledgements

We thank Fernando Pastor for his valuable contribution. This research has been funded by DECODER (decoder-project.eu), iv4XR (iv4xr-project.eu), and IVVES (ivves.weebly.com) projects.

Author information

Corresponding author: Olivia Rodríguez-Valdés.

Editor information

Editors and Affiliations

Rights and permissions

Reprints and Permissions

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Rodríguez-Valdés, O., Vos, T.E.J., Aho, P., Marín, B. (2021). 30 Years of Automated GUI Testing: A Bibliometric Analysis. In: Paiva, A.C.R., Cavalli, A.R., Ventura Martins, P., Pérez-Castillo, R. (eds) Quality of Information and Communications Technology. QUATIC 2021. Communications in Computer and Information Science, vol 1439. Springer, Cham. https://doi.org/10.1007/978-3-030-85347-1_34

  • DOI: https://doi.org/10.1007/978-3-030-85347-1_34

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85346-4

  • Online ISBN: 978-3-030-85347-1

  • eBook Packages: Computer Science (R0)