From Faults Via Test Purposes to Test Cases: On the Fault-Based Testing of Concurrent Systems

  • Bernhard K. Aichernig
  • Carlo Corrales Delgado
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3922)


Fault-based testing is a technique where testers anticipate errors in a system under test in order to assess or generate test cases. The idea is to have enough test cases capable of detecting these anticipated errors. This paper presents a theory and technique for generating fault-based test cases for concurrent systems. The novel idea is to generate test purposes from faults that have been injected into a model of the system under test. Such test purposes form a specification of a more detailed test case that can detect the injected fault. The theory is based on the notion of refinement. The technique is automated using the TGV test case generator and an equivalence checker of the CADP tools. A case study of testing web servers demonstrates the practicability of the approach.
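The paper's tool chain (TGV plus the CADP equivalence checker) automates this flow; purely as an illustrative sketch, the core loop can be mimicked in a few lines of Python over toy labelled transition systems. The states, labels, and the bounded trace-inclusion check below are all simplifications introduced here, not the paper's actual refinement relation:

```python
# Hedged sketch of fault-based test-purpose generation: inject a fault into a
# model, compare mutant against the original specification, and use a
# distinguishing trace as a test purpose. (Toy stand-in for TGV/CADP.)
from collections import deque


def traces(lts, start, depth):
    """All label sequences of length <= depth from `start`.
    `lts` maps state -> list of (label, next_state) transitions."""
    out = {()}
    frontier = deque([(start, ())])
    while frontier:
        state, trace = frontier.popleft()
        if len(trace) == depth:
            continue
        for label, nxt in lts.get(state, []):
            t = trace + (label,)
            out.add(t)
            frontier.append((nxt, t))
    return out


def distinguishing_trace(spec, mutant, start, depth=6):
    """Shortest trace the mutant can perform but the spec cannot, or None
    if the mutant trace-conforms up to `depth`. Such a trace specifies a
    test purpose for detecting the injected fault."""
    diff = traces(mutant, start, depth) - traces(spec, start, depth)
    return min(diff, key=len) if diff else None


# Hypothetical mini web-server model: request followed by ok/err response.
spec = {0: [("req", 1)], 1: [("ok", 0), ("err", 0)]}
# Injected fault: the mutant may emit `ok` without a preceding request.
mutant = {0: [("req", 1), ("ok", 0)], 1: [("ok", 0), ("err", 0)]}

print(distinguishing_trace(spec, mutant, 0))  # → ('ok',)
```

An equivalent mutant (one whose traces the specification already allows) yields `None`, i.e. no test purpose, mirroring how equivalence checking filters out useless mutations before test generation.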


Keywords: Model Checker, Mutation Testing, Testing Theory, Test Purpose, Labelled Transition System



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Bernhard K. Aichernig (1)
  • Carlo Corrales Delgado (1)
  1. International Institute for Software Technology, United Nations University, Macau SAR, China
