Exploiting Traceability Uncertainty Between Software Architectural Models and Performance Analysis Results

  • Catia Trubiani
  • Achraf Ghabi
  • Alexander Egyed
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9278)

Abstract

While software architecture performance analysis is a well-studied field, it is less understood how the analysis results (i.e., mean values, variances, and/or probability distributions) trace back to the architectural model elements (i.e., software components, interactions among components, deployment nodes). Yet, understanding this traceability is critical for interpreting the analysis results in the context of the architecture. The goal of this paper is to automate the traceability between software architectural models and performance analysis results by investigating the uncertainty that arises in bridging these two domains. Our approach uses performance antipatterns to deduce the logical consequences between architectural elements and analysis results, and automatically builds a graph of traces that identifies the most critical causes of performance flaws. We developed a tool, SoPeTraceAnalyzer, that jointly considers SOftware and PErformance concepts and automatically builds model-to-results traceability links. The benefit of the tool is illustrated by means of a case study in the e-health domain.

Keywords

Traceability · Uncertainty · Software modelling · Performance analysis



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Gran Sasso Science Institute, L’Aquila, Italy
  2. Johannes Kepler University, Linz, Austria