Software & Systems Modeling, Volume 13, Issue 4, pp 1269–1290

The KlaperSuite framework for model-driven reliability analysis of component-based systems

  • Andrea Ciancone
  • Mauro Luigi Drago
  • Antonio Filieri
  • Vincenzo Grassi
  • Heiko Koziolek
  • Raffaela Mirandola
Theme Section Paper

Abstract

Automatic prediction tools play a key role in making the analysis of non-functional requirements practical: they simplify the selection and assembly of components for component-based software systems and reduce the mathematical expertise demanded of software designers. By exploiting the Model-Driven Engineering (MDE) paradigm, design models can be automatically transformed into analytical models, thus enabling formal property verification. MDE is the core paradigm of the KlaperSuite framework presented in this paper, which exploits the KLAPER pivot language to fill the gap between the design and the analysis of component-based systems with respect to reliability properties. KlaperSuite is a family of tools that empowers designers to capture and analyze quality-of-service views of their systems by building a one-click bridge towards a number of established verification instruments. In this article, we concentrate on the reliability-prediction capabilities of KlaperSuite and evaluate them on several case studies from the literature and industry.
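As a minimal illustration of what such a design-to-analysis transformation produces (this sketch is not part of KlaperSuite, and the component names, transition probabilities, and per-component reliabilities are purely illustrative assumptions), the following Python fragment builds and solves the kind of analytical model typically targeted for reliability prediction: an absorbing discrete-time Markov chain over component control flow, in the spirit of Cheung's user-oriented reliability model, solved for the probability of reaching successful termination.

    import numpy as np

    # Probability of transferring control between components (each row sums
    # to <= 1; the remainder is the probability of terminating successfully).
    transfer = {
        ("Scheduler", "Worker"): 0.7,
        ("Scheduler", "Logger"): 0.2,
        ("Worker", "Logger"): 0.5,
    }
    # Probability that each component completes one execution without failing.
    reliability = {"Scheduler": 0.999, "Worker": 0.995, "Logger": 0.990}

    components = sorted(reliability)
    index = {c: i for i, c in enumerate(components)}
    n = len(components)

    # Q[i, j]: probability of moving from component i to j without failing.
    Q = np.zeros((n, n))
    for (src, dst), p in transfer.items():
        Q[index[src], index[dst]] = reliability[src] * p

    # terminate[i]: probability of finishing successfully from component i.
    terminate = np.array(
        [reliability[c] * (1.0 - sum(p for (s, _), p in transfer.items() if s == c))
         for c in components]
    )

    # Probability of eventually reaching the success state from each component:
    # solve (I - Q) r = terminate.
    r = np.linalg.solve(np.eye(n) - Q, terminate)
    print(f"Predicted system reliability (start = Scheduler): {r[index['Scheduler']]:.6f}")

In KlaperSuite this step is automated: the KLAPER model derived from the design is transformed into such an analytical model and handed to an established verification back-end, so the designer never writes the equations by hand.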

Keywords

Model-driven engineering · Reliability analysis · Component-based systems

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Andrea Ciancone (1)
  • Mauro Luigi Drago (1)
  • Antonio Filieri (2, 1)
  • Vincenzo Grassi (3)
  • Heiko Koziolek (4)
  • Raffaela Mirandola (1)

  1. Politecnico di Milano, Milan, Italy
  2. Reliable Software Systems Group, University of Stuttgart, Stuttgart, Germany
  3. Università di Roma “Tor Vergata”, Roma, Italy
  4. Industrial Software Systems, ABB Corporate Research, Ladenburg, Germany