Software Performance Antipatterns: Modeling and Analysis

  • Vittorio Cortellessa
  • Antinisca Di Marco
  • Catia Trubiani
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7320)


Capturing performance problems is a critical task in software design, mostly because the results of performance analysis (i.e., mean values, variances, and probability distributions) are difficult to interpret in a way that provides feedback to software designers. Support for interpreting performance analysis results, which would help to fill the gap between numbers and design alternatives, is still lacking. The aim of this chapter is to present the work that has been done in the last few years on filling this gap. The work is centered on software performance antipatterns, which are recurring design mistakes (i.e., bad practices) that negatively affect performance, together with their recommended solutions. Such antipatterns can play a key role in the software performance domain, since they can be used both to investigate performance problems and to formulate solutions in terms of design alternatives.
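To make the idea concrete, an antipattern can be rendered as a logical predicate over metrics extracted from the software model and from the performance analysis results. The sketch below illustrates this for a "Blob"-style antipattern (a component that centralizes work and saturates its resource); the metric names and numeric thresholds are illustrative assumptions, not the chapter's actual formalization.

```python
# Illustrative sketch, NOT the authors' formalization: a performance
# antipattern expressed as a boolean predicate over architectural and
# performance metrics. Thresholds are hypothetical placeholders; in
# practice they would be bound to system-specific requirements.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    num_connections: int   # structural metric: links to other components
    msgs_sent: int         # behavioral metric: messages it generates
    utilization: float     # performance result: resource utilization in [0, 1]

# Hypothetical threshold values (assumptions for this example).
MAX_CONNECTIONS = 5
MAX_MSGS = 100
MAX_UTILIZATION = 0.85

def is_blob(c: Component) -> bool:
    """Blob-style predicate: a component that centralizes work
    (many connections, heavy message traffic) and is saturated."""
    return (c.num_connections > MAX_CONNECTIONS
            and c.msgs_sent > MAX_MSGS
            and c.utilization > MAX_UTILIZATION)

def detect(components: list[Component]) -> list[str]:
    """Return the names of components matching the antipattern predicate."""
    return [c.name for c in components if is_blob(c)]

system = [
    Component("Scheduler", num_connections=8, msgs_sent=240, utilization=0.93),
    Component("Logger", num_connections=2, msgs_sent=30, utilization=0.40),
]
print(detect(system))  # ['Scheduler']
```

Components flagged by such predicates become candidates for refactoring; the corresponding antipattern solution (e.g., redistributing the Blob's responsibilities) then suggests the design alternative to evaluate next.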


Keywords: Software Architecture · Performance Evaluation · Antipatterns · Feedback Generation · Design Alternatives





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Vittorio Cortellessa (1)
  • Antinisca Di Marco (1)
  • Catia Trubiani (1)

  1. Dipartimento di Informatica, University of L’Aquila, Italy
