Automated Test Case Generation in End-User Programming

  • Nysret Musliu
  • Wolfgang Slany
  • Johannes Gärtner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7897)

Abstract

Generating test cases for code written by end-user programmers is crucial to ensuring its correctness. In this paper we investigate the automatic generation of test cases for programs that are written in Visual Basic for Applications (VBA) and used in MS Excel. We implement a metaheuristic search method to generate tests that achieve satisfactory statement and branch coverage, and our methodology additionally visualizes the achieved code coverage. The generated test cases and the visualization enable end users to better understand the behavior of their programs and increase the probability of detecting errors when the code is changed later.
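The abstract's core idea, metaheuristic search that steers test inputs toward uncovered branches, can be illustrated with a minimal sketch. The paper itself targets VBA in Excel; the Python below is only an assumed illustration of the general search-based approach, using a standard branch-distance fitness function and a simple hill-climbing search over a hypothetical target branch `x == 42`. None of these names or parameters come from the paper.

```python
import random

def branch_distance(x):
    # Branch distance for the hypothetical target branch "x == 42":
    # 0 means the branch is taken; larger values mean "further away".
    return abs(x - 42)

def hill_climb(fitness, lo=-1000, hi=1000, max_iters=10000, seed=0):
    """Search for an input that drives `fitness` to 0 (branch covered)."""
    rng = random.Random(seed)
    x = rng.randint(lo, hi)
    best = fitness(x)
    for _ in range(max_iters):
        if best == 0:
            break  # target branch is covered
        # Explore a small neighbourhood plus one random restart candidate.
        nxt = min([x - 1, x + 1, rng.randint(lo, hi)], key=fitness)
        if fitness(nxt) < best:
            x, best = nxt, fitness(nxt)
    return x, best

x, d = hill_climb(branch_distance)
print(x, d)  # prints an input with branch distance 0, i.e. 42 0
```

Because the branch-distance landscape here is a smooth gradient, the neighbourhood moves alone suffice; real search-based test generators combine such fitness functions with stronger metaheuristics (e.g. simulated annealing or tabu search) to cope with plateaus and local optima.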



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Nysret Musliu (1)
  • Wolfgang Slany (2)
  • Johannes Gärtner (3)
  1. DBAI, Technische Universität Wien, Austria
  2. IST, Technische Universität Graz, Austria
  3. Ximes GmbH, Austria
