
Model-Driven Generation of Performance Prototypes

  • Steffen Becker
  • Tobias Dencker
  • Jens Happe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5119)

Abstract

Early, model-based performance predictions help to understand the consequences of design decisions on the performance of the resulting system before its implementation becomes available. While this helps to reduce the cost of redesigning systems that do not meet their extra-functional requirements, performance prediction models have to abstract from the full complexity of modern hardware and software environments, which can lead to imprecise predictions. As a solution, constructing and executing prototypes on the target execution environment gives early insights into the behaviour of the system under realistic conditions. Several approaches in the literature generate prototypes from models, but they either produce only code skeletons or require detailed models of the prototype. In this paper, we present an approach that aims at the automated generation of a performance prototype based solely on a design model with performance annotations. For the concrete realisation, we used the Palladio Component Model (PCM), a component-based architecture modelling language that supports early performance analyses. For a typical three-tier business application, the resulting Java EE code shows how the prototype can be used to evaluate the influence of complex parts of the execution environment, such as memory interactions or the operating system's scheduler.
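For readers unfamiliar with performance prototyping, the following hand-written Java sketch illustrates the general idea behind such prototypes: an operation does no real business logic, but emulates the resource demand taken from a performance annotation so that response times can be measured on the target execution environment. The class and method names and the 50 ms demand are hypothetical illustrations; this is not the Java EE code generated by the approach described in the paper.

    // Minimal sketch, not from the paper: emulating an annotated CPU demand
    // by burning CPU time, then measuring the resulting response time.
    public class PrototypeDemo {

        // Occupies the CPU for roughly the requested number of milliseconds,
        // standing in for a resource demand taken from a performance annotation.
        static void consumeCpu(long demandMillis) {
            long end = System.nanoTime() + demandMillis * 1_000_000L;
            double x = 1.0;
            while (System.nanoTime() < end) {
                x = Math.sqrt(x + 1.0); // busy work instead of real computation
            }
        }

        // Stand-in for a prototype component operation with an annotated demand.
        static void processOrder() {
            consumeCpu(50); // hypothetical annotated demand: 50 ms of CPU time
        }

        public static void main(String[] args) {
            long start = System.nanoTime();
            processOrder();
            System.out.printf("response time: %.1f ms%n",
                    (System.nanoTime() - start) / 1e6);
        }
    }

Measuring many such calls under a generated workload, rather than a single invocation, is what allows the prototype to expose effects of the real execution environment (scheduling, memory, contention) that abstract prediction models cannot capture.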

Keywords

Performance Prototyping · Model-Driven Software Engineering · Palladio Component Model



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Steffen Becker (1)
  • Tobias Dencker (2)
  • Jens Happe (3)
  1. FZI Forschungszentrum Informatik Karlsruhe, Karlsruhe, Germany
  2. Chair of Software Design and Quality (SDQ), University of Karlsruhe (TH), Karlsruhe, Germany
  3. Graduate School Trustsoft, University of Oldenburg, Oldenburg, Germany
