An Extensible Monitoring Framework for Measuring and Evaluating Tool Performance in a Service-Oriented Architecture

  • Christoph Becker
  • Hannes Kulovits
  • Michael Kraxner
  • Riccardo Gottardi
  • Andreas Rauber
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5648)


The lack of reliable QoS attribute values remains one of the fundamental drawbacks of web service technology. Most approaches for modelling and monitoring QoS and web service performance focus either on client-side measurement and feedback of QoS attributes, or on ranking and discovery, developing extensions of the standard web service discovery models. However, in many cases, provider-side measurement can add considerable value to the evaluation and selection of services and their underlying implementations.

We present a generic architecture and reference implementation for non-invasive provider-side instrumentation of data-processing tools exposed as QoS-aware web services, where real-time quality information is obtained through an extensible monitoring framework. In this architecture, dynamically configurable execution engines measure QoS attributes and instrument the corresponding web services on the provider side. We demonstrate the application of this framework to the task of performance monitoring of a variety of applications on different platforms, thus enriching the services with real-time QoS information, which is accumulated in an experience base.
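The paper does not include source code, but the core idea of the architecture, a monitoring engine that non-invasively wraps each tool invocation on the provider side, measures QoS attributes such as execution time and memory use, and accumulates the results in an experience base, can be illustrated with a rough sketch. All names below (`MonitoringEngine`, `ExperienceBase`, the attribute labels) are hypothetical and not taken from the paper:

```python
import time
import tracemalloc
from collections import defaultdict


class ExperienceBase:
    """Accumulates measured QoS values per (service, attribute) pair."""

    def __init__(self):
        self._data = defaultdict(list)

    def record(self, service, attribute, value):
        self._data[(service, attribute)].append(value)

    def average(self, service, attribute):
        values = self._data[(service, attribute)]
        return sum(values) / len(values) if values else None


class MonitoringEngine:
    """Wraps a tool invocation and measures QoS attributes on the provider side,
    without requiring any change to the tool itself (non-invasive)."""

    def __init__(self, experience_base):
        self.experience_base = experience_base

    def execute(self, service_name, tool, *args, **kwargs):
        tracemalloc.start()
        start = time.perf_counter()
        try:
            result = tool(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            # Feed the measurements into the experience base so that later
            # service selection can draw on real execution history.
            self.experience_base.record(service_name, "wallClockTime", elapsed)
            self.experience_base.record(service_name, "peakMemoryBytes", peak)
        return result


# Usage: wrap a stand-in "conversion tool" (here just a string transformation).
base = ExperienceBase()
engine = MonitoringEngine(base)
output = engine.execute("exampleTool", lambda data: data.upper(), "sample")
```

In a real deployment the engine would sit between the web service endpoint and the wrapped command-line tool, so the service interface stays unchanged while every call enriches the experience base.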


Keywords: Service Selection, Service Execution, Digital Preservation, Conversion Tool, Monitoring Engine



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Christoph Becker
  • Hannes Kulovits
  • Michael Kraxner
  • Riccardo Gottardi
  • Andreas Rauber

All authors: Vienna University of Technology, Vienna, Austria
