Continuous Rearchitecting of QoS Models: Collaborative Analysis for Uncertainty Reduction

  • Catia Trubiani
  • Raffaela Mirandola
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10475)


Architecting high-quality software systems is not trivial: to know whether a certain quality attribute has been achieved, it must be analysed continuously. Reasoning about multiple quality attributes (e.g., performance, availability) of software systems is even harder, since it requires jointly analysing multiple, heterogeneous Quality-of-Service (QoS) models. The goal of this paper is to investigate the combined use of different QoS models and their continuous re-architecting, since the knowledge acquired through one QoS model may affect another model; the result is a collaborative analysis process that reduces the overall uncertainty. Starting from an example of interaction between two different QoS models, i.e., a Bayesian Network for availability and a Queueing Network for performance, we demonstrate that the collaborative analysis benefits the overall process by reducing the initial uncertainty. We identify the join/fork points within the analysis process that bring the quality characteristics of software systems upfront, thus enabling the re-architecting of systems in case of quality flaws. In this way, QoS analysis becomes an integrated activity in the whole software development life-cycle, and quality characteristics are continuously exposed to system architects.
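The interaction the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): an availability estimate maintained by a conjugate Beta-Bernoulli update (a one-node stand-in for a Bayesian Network) feeds the effective load of an M/M/1 queueing model, and each round of shared observations narrows the availability uncertainty used on the performance side. All numeric parameters (probe counts, arrival and service rates) are assumed for illustration.

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update for an availability probability."""
    return alpha + successes, beta + failures

def beta_mean_var(alpha, beta):
    """Mean and variance of a Beta(alpha, beta) availability estimate."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue (requires a stable queue)."""
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

# Round 1: vague prior over availability.
a, b = 1.0, 1.0
_, var_before = beta_mean_var(a, b)

# Round 2: 98 successful probes and 2 failures observed while running
# the performance experiments (collaborative observations, assumed values).
a, b = beta_update(a, b, successes=98, failures=2)
mean_avail, var_after = beta_mean_var(a, b)
assert var_after < var_before  # the shared observations reduced uncertainty

# The refined availability rescales the effective load on the performance model:
offered_load = 50.0   # requests/s submitted by clients (assumed)
service_rate = 80.0   # requests/s the server can process (assumed)
effective_load = offered_load * mean_avail  # only available periods carry traffic
print(mm1_response_time(effective_load, service_rate))
```

The point of the sketch is the join point: the posterior produced by the availability model is an input parameter of the performance model, so re-running the loop after each batch of observations is what the paper calls continuous re-architecting of the QoS models.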


Keywords: Continuous rearchitecting · Collaborative QoS analysis · Uncertainty reduction · Bayesian Networks · Queueing Networks



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Gran Sasso Science Institute, L'Aquila, Italy
  2. Politecnico di Milano, Milano, Italy
