Ariadne: Evolving Test Data Using Grammatical Evolution

  • Muhammad Sheraz Anjum
  • Conor Ryan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11451)


Software testing is a key component of software quality assurance; it typically involves generating test data that exercises all instructions and tested conditions in a program and, owing to its complexity, can consume as much as 50% of the overall software development budget. Several evolutionary computing techniques have been successfully applied to automate test data generation, but no existing technique exploits variable interdependencies in the process, even though several studies from the software testing literature suggest that the variables examined in the branching conditions of real-life programs are often interdependent, as in conditions such as if (x == y).
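Why such interdependent branches are hard for undirected search can be illustrated with a small sketch (illustrative only; the function names, input ranges, and trial counts below are our own assumptions, not taken from the paper): sampling the two operands of if (x == y) independently almost never takes the branch, while deriving one operand from the other always does.

```python
import random

def target(x, y):
    # Branch guarded by an interdependent condition, e.g. if (x == y).
    return x == y

def random_hits(trials=100_000, lo=0, hi=10_000, seed=42):
    """Count how often two independently drawn inputs take the branch."""
    rng = random.Random(seed)
    return sum(target(rng.randint(lo, hi), rng.randint(lo, hi))
               for _ in range(trials))

def dependent_hits(trials=100_000, lo=0, hi=10_000, seed=42):
    """Derive the second input from the first, exploiting the interdependency."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = rng.randint(lo, hi)
        y = x  # copy the first input instead of sampling independently
        hits += target(x, y)
    return hits
```

With the ranges above, independent sampling takes the branch only about once in every ten thousand trials, whereas the dependent generator takes it every time.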

We propose the Ariadne system, which uses Grammatical Evolution (GE) and a simple Attribute Grammar to exploit variable interdependencies in the process of test data generation. Our results show that Ariadne dramatically improves both effectiveness and efficiency over existing techniques on well-established criteria, attaining coverage (the standard software testing success metric for these sorts of problems) of 100% on all benchmarks with far fewer program evaluations (often between a third and a tenth of those required by other systems).
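The flavour of the approach can be sketched with a toy GE-style genotype-to-phenotype mapping (our own illustration under stated assumptions, not Ariadne's actual grammar): each test input is derived either as a fresh literal or as a copy of an earlier input, so evolution can discover that reusing a value satisfies a condition such as if (x == y).

```python
def map_genome(genome, n_inputs=3, lo=0, hi=100):
    """Map a GE genome (a list of integer codons) onto a vector of test inputs.

    Illustrative grammar: <input> ::= <literal> | <copy-of-earlier-input>.
    The second production is what lets evolved test data reuse a value
    across interdependent variables.
    """
    codons = iter(genome)
    inputs = []
    for _ in range(n_inputs):
        # After the first input, one codon chooses the production to apply.
        if inputs and next(codons) % 2 == 1:
            inputs.append(inputs[next(codons) % len(inputs)])  # copy rule
        else:
            inputs.append(next(codons) % (hi - lo + 1) + lo)   # literal rule
    return inputs
```

For example, the genome [3, 7, 1, 1, 0] maps to the inputs [3, 3, 3], which immediately satisfy a condition such as if (x == y), while [10, 4, 20, 6, 30] selects only the literal rule and yields three independent values [10, 20, 30].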


Keywords: Automatic test case generation · Code coverage · Evolutionary testing · Grammatical Evolution · Variable interdependencies



The authors would like to thank Muhammad Hamad Khan for his help with the graphic designs. This work is supported by Lero, the Irish Software Research Centre, and Science Foundation Ireland.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Computer Science and Information Systems, University of Limerick, Castletroy, Ireland
