Cluster Computing, Volume 21, Issue 1, pp 729–739

Parallel testing and coverage analysis for context-free applications

  • Abdul Rauf
  • Muhammad Ramzan


Software testing, one of the major phases of the software development life cycle, is critical to delivering reliable software products. Traditional manual GUI testing has severe limitations: insufficient test coverage, labor intensiveness, complexity, and cost ineffectiveness. "How much testing is enough?" remains a challenging question. Coverage analysis guides test engineers regarding test coverage and is used extensively to determine the effectiveness of a selected testing methodology. These problems with manual GUI testing prompted the automation of GUI testing and coverage analysis. With the rapid emergence of GUI-based context-free applications, automated testing tools seem less effective. Recently, it has been shown that clusters of concurrent and distributed machines can reduce the time required to test a GUI. The proposed system runs two evolutionary algorithms concurrently to pursue multiple objectives, with fitness functions that maximize GUI test-path coverage (measured as the number of events in an event sequence exercised by a particular test case) while simultaneously minimizing the number of test cases. For coverage analysis, two well-known multi-objective evolutionary algorithms, NSGA-II and MOPSO, have been employed. Experimental results of the coverage analysis show that both MOPSO and NSGA-II achieved accuracy of roughly 85%; this high accuracy indicates the usefulness of the proposed fitness function. Both algorithms execute more than 90% of the test paths.
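The two objectives described above (maximizing event-sequence coverage while minimizing suite size) can be illustrated with a minimal sketch. This is not the authors' implementation; the event names and the suite representation (a test suite as a list of event sequences) are assumptions for illustration, and only the Pareto-dominance comparison underlying NSGA-II and MOPSO is shown.

```python
from itertools import chain

def coverage(test_suite):
    """Number of distinct GUI events exercised by the suite's event sequences."""
    return len(set(chain.from_iterable(test_suite)))

def objectives(test_suite):
    # Cast both goals as minimization: negate coverage, keep suite size.
    return (-coverage(test_suite), len(test_suite))

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective and better in at least one."""
    fa, fb = objectives(a), objectives(b)
    return all(x <= y for x, y in zip(fa, fb)) and any(x < y for x, y in zip(fa, fb))

# Hypothetical event sequences for a small GUI under test.
suite_a = [["open", "edit", "save"], ["open", "close"]]
suite_b = [["open", "edit", "save"], ["open", "close"], ["open"]]
print(dominates(suite_a, suite_b))  # True: same coverage with fewer test cases
```

Both NSGA-II and MOPSO rank candidate suites by this kind of dominance relation rather than by a single scalar score, which is what lets them trade coverage against suite size on a Pareto front.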


GUI testing · Test automation · Coverage analysis · NSGA-II · Context-free applications testing · Parallel testing



Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Department of Computer Science, College of Computer & Information Sciences, Al-Imam Mohammad Ibn Saud Islamic University (IMSIU), Riyadh, Kingdom of Saudi Arabia
  2. Department of Computer Science, College of Computer Science, Saudi Electronic University, Riyadh, Kingdom of Saudi Arabia
