Model-Based Generation of Testbeds for Web Services

  • Antonia Bertolino
  • Guglielmo De Angelis
  • Lars Frantzen
  • Andrea Polini
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5047)


A Web Service is commonly not an independent software entity but plays a role in some business process; to provide its own service, it depends on services offered by external Web Services. While a Web Service is being developed and tested, such external services are not always available, or their use has unwanted side effects such as utilization fees or database modifications. We present a model-based approach to generate stubs for Web Services that respect both an extra-functional contract, expressed via a Service Level Agreement (SLA), and a functional contract, modeled via a state machine. These stubs allow a developer to set up a testbed on the target platform, in which the extra-functional and functional behavior of a Web Service under development can be tested before its publication.
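The idea of a stub that honors both contracts can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not material from the paper: the hypothetical shopping-cart service, its state and operation names, and the latency bound stand in for a functional state-machine model and an SLA clause.

```python
import random
import time

# Hypothetical functional contract as a state machine: each state maps
# the operations it accepts to the state reached after the call.
STATE_MACHINE = {
    "Empty":   {"addItem": "Filled"},
    "Filled":  {"addItem": "Filled", "checkout": "Ordered"},
    "Ordered": {},  # terminal state: no further operations accepted
}

# Hypothetical extra-functional contract (SLA): maximum response latency
# in seconds that the stub is allowed to exhibit.
SLA_MAX_LATENCY = 0.5


class ServiceStub:
    """Stands in for an external Web Service during testbed runs."""

    def __init__(self, state_machine, initial_state, max_latency):
        self.model = state_machine
        self.state = initial_state
        self.max_latency = max_latency

    def invoke(self, operation):
        transitions = self.model[self.state]
        if operation not in transitions:
            # Functional-contract violation: reject the call, as the
            # modeled service would in this state.
            raise ValueError(
                f"operation '{operation}' not allowed in state '{self.state}'")
        # Emulate a response time drawn within the SLA bound.
        time.sleep(random.uniform(0, self.max_latency))
        self.state = transitions[operation]
        return f"{operation} ok, now in state {self.state}"


stub = ServiceStub(STATE_MACHINE, "Empty", SLA_MAX_LATENCY)
print(stub.invoke("addItem"))   # Empty -> Filled
print(stub.invoke("checkout"))  # Filled -> Ordered
```

A testbed would deploy such generated stubs in place of the real external services, so the service under development can be exercised against dependencies whose functional and timing behavior stays within the agreed contracts.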



Copyright information

© IFIP International Federation for Information Processing 2008

Authors and Affiliations

  • Antonia Bertolino (1)
  • Guglielmo De Angelis (1)
  • Lars Frantzen (1, 2)
  • Andrea Polini (1, 3)

  1. Istituto di Scienza e Tecnologie della Informazione “Alessandro Faedo”, Consiglio Nazionale delle Ricerche, Pisa, Italy
  2. Institute for Computing and Information Sciences (ICIS), Radboud University Nijmegen, The Netherlands
  3. Department of Mathematics and Computer Science, University of Camerino, Italy
