Automatic Conformance Testing of Web Services

  • Reiko Heckel
  • Leonardo Mariani
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3442)

Abstract

Web Services are the basic building blocks of next-generation Internet applications based on dynamic service discovery and composition. Dedicated discovery services will store both syntactic and behavioral descriptions of available services and guarantee their compatibility with the requirements expressed by clients. In practice, however, interactions may still fail because the Web Service’s implementation may be faulty: the client has no guarantee of the quality of the implementation associated with a service description.

In this paper, we propose high-quality service discovery, which incorporates automatic testing to validate Web Services before allowing their registration. The discovery service first generates conformance test cases automatically from the provided service description, then executes them against the target Web Service, and registers the service only if all test cases pass.

In this way, clients bind only to Web Services that provide a compatible signature, suitable behavior, and a high-quality implementation.
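
The workflow described above can be summarised in a short sketch: the discovery service derives conformance test cases from a submitted service description, executes them against the service endpoint, and registers the service only if every test passes. The sketch below is a minimal, hypothetical Python illustration; the names (generate_test_cases, invoke, register_service) are not from the paper, and the derivation of test cases from graph-transformation-based behavioral descriptions is only stubbed.

    # Hypothetical sketch of test-before-registration in a discovery service.
    # The paper derives test cases from graph-transformation rules; that step
    # is replaced here by a trivial per-operation check.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class TestCase:
        name: str
        run: Callable[[str], bool]  # takes the service endpoint, returns pass/fail

    def invoke(endpoint: str, operation: str) -> bool:
        """Stand-in for calling the Web Service and checking the observed
        behavior against the behavioral description; always passes here."""
        return True

    def generate_test_cases(description: Dict) -> List[TestCase]:
        """Derive conformance test cases from the provided service description."""
        return [
            TestCase(name=op, run=lambda endpoint, op=op: invoke(endpoint, op))
            for op in description.get("operations", [])
        ]

    def register_service(description: Dict, endpoint: str,
                         registry: Dict[str, str]) -> bool:
        """Register the service only if all generated test cases pass."""
        for test in generate_test_cases(description):
            if not test.run(endpoint):
                return False  # faulty implementation: registration is refused
        registry[description["name"]] = endpoint
        return True

    # Example: a (fictitious) bookshop service with two operations.
    registry: Dict[str, str] = {}
    accepted = register_service(
        {"name": "Bookshop", "operations": ["order", "cancel"]},
        "http://example.org/bookshop",
        registry,
    )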

Keywords

Discovery Service · Graph Transformation · Generate Test Case · Concrete State · Input Domain

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Reiko Heckel ¹
  • Leonardo Mariani ¹
  1. Department of Mathematics and Computer Science, University of Paderborn, Paderborn, Germany
