Gast: Generic Automated Software Testing

  • Pieter Koopman
  • Artem Alimarine
  • Jan Tretmans
  • Rinus Plasmeijer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2670)

Abstract

Software testing is a labor-intensive, and hence expensive, yet widely used technique for controlling software quality. In this paper we introduce Gast, a fully automatic test tool. Properties of functions and datatypes can be expressed in first-order logic. Gast automatically and systematically generates appropriate test data, evaluates the property for these values, and analyzes the test results. This makes testing software components easier and cheaper. The distinguishing feature of our system is that the test data are generated in a systematic and generic way using generic programming techniques. This means there is no need for the user to indicate how data should be generated. Moreover, duplicate tests are avoided, and for finite domains Gast is able to prove a property by testing it for all possible values. As an important side effect, the tool also encourages stating formal properties of the software.
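
To give a flavour of the style of use the abstract describes, here is a minimal sketch in Clean, the functional language Gast is built for. It is illustrative only, not taken from the paper: the module name Gast and the exact signature of test are assumptions; the abstract fixes only that properties are ordinary functions whose arguments are implicitly universally quantified.

    module example

    import StdEnv
    import Gast  // assumed module name for the Gast library

    // Property over an infinite domain: reversing a list twice yields the
    // original list. Gast enumerates lists of Int systematically, so no
    // test case is generated twice.
    propRevRev :: [Int] -> Bool
    propRevRev xs = reverse (reverse xs) == xs

    // Property over a finite domain: De Morgan's law for Booleans. Since
    // Bool x Bool has only four values, Gast can test them all and report
    // the property as proven rather than merely passed.
    propDeMorgan :: Bool Bool -> Bool
    propDeMorgan x y = not (x && y) == (not x || not y)

    // The paper describes a function `test` that takes a property and
    // produces a test report; a Clean program runs by evaluating Start.
    Start = test propDeMorgan

Because Bool is finite, exhaustive generation turns the second property into a proof, the finite-domain behaviour mentioned in the abstract; for propRevRev, Gast instead reports success for the systematically generated test cases.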

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Pieter Koopman (1)
  • Artem Alimarine (1)
  • Jan Tretmans (1)
  • Rinus Plasmeijer (1)

  1. Nijmegen Institute for Computer and Information Science, The Netherlands