
A QoS Test-Bed Generator for Web Services

  • Antonia Bertolino
  • Guglielmo De Angelis
  • Andrea Polini
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4607)

Abstract

In recent years, both industry and academia have shown great interest in ensuring consistent cooperation among business-critical services, with contractually agreed levels of Quality of Service (QoS). Service Level Agreement specifications, as well as techniques for their evaluation, are by now indispensable assets. This paper presents Puppet (Pick UP Performance Evaluation Test-bed), an approach and a tool for the automatic generation of test-beds to empirically evaluate the QoS features of a Web Service under development. Specifically, the generation exploits the information in the coordination scenario (be it a choreography or an orchestration), the service description (WSDL), and the specification of the agreements (WS-Agreement).
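To make the idea concrete, the following is a minimal, hypothetical sketch of the kind of stub such a test-bed generator could emit: a fake service whose observable latency respects a guarantee term taken from an agreement. The class and attribute names (`GuaranteeTerm`, `GeneratedStub`, `max_latency`) are illustrative assumptions, not Puppet's actual generated code or the WS-Agreement schema.

```python
import random
import time


class GuaranteeTerm:
    """Simplified Service Level Objective: respond within max_latency seconds.

    Hypothetical stand-in for a guarantee term parsed from a WS-Agreement
    document; the real specification is far richer.
    """

    def __init__(self, max_latency):
        self.max_latency = max_latency


class GeneratedStub:
    """Emulates a partner service whose QoS honours the agreed guarantee term."""

    def __init__(self, term, canned_response):
        self.term = term
        self.canned_response = canned_response

    def invoke(self, request):
        # Sleep for a random latency within the agreed bound, then reply
        # with a canned answer, so the service under test experiences a
        # realistic (but contract-respecting) remote call.
        time.sleep(random.uniform(0, self.term.max_latency))
        return self.canned_response


if __name__ == "__main__":
    stub = GeneratedStub(GuaranteeTerm(max_latency=0.05), canned_response="OK")
    start = time.time()
    reply = stub.invoke("getQuote")
    elapsed = time.time() - start
    print(reply, elapsed)
```

In a full test-bed, one such stub would be generated per partner service declared in the coordination scenario, each parameterized by its own guarantee terms.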

Keywords

Service Composition, Service Level Agreement, Service Level Objective, Guarantee Term, Global Grid Forum


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Antonia Bertolino (1)
  • Guglielmo De Angelis (1)
  • Andrea Polini (1)
  1. Istituto di Scienza e Tecnologie dell'Informazione - CNR, Via Moruzzi 1, 56124 Pisa, Italy
