Enabling Performance Modeling for the Masses: Initial Experiences

  • Abel Gómez
  • Connie U. Smith
  • Amy Spellmann
  • Jordi Cabot
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11150)


Performance problems such as sluggish response times or low throughput are especially annoying, frustrating, and noticeable to users, and fixing them after they occur incurs unplanned expense and delay. Our vision is an MDE-intensive software development paradigm for complex systems in which software designers can evaluate performance early in development, when the analysis can have the greatest impact. We seek to empower designers to perform the analysis themselves by automating the creation of performance models from standard design models. Such performance models can be solved automatically, providing results that are meaningful to designers. In our vision, this automation is enabled by model-to-model transformations: first, designers create UML design models annotated with the Modeling and Analysis of Real-Time and Embedded systems (MARTE) profile; second, these models are transformed into automatically solvable performance models by means of QVT (Query/View/Transformation). This paper reports on our first experiences implementing these two initial activities.
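To make the pipeline concrete, the following is a minimal, purely illustrative sketch of the idea behind the abstract: design-model steps carrying MARTE-style resource-demand annotations are "transformed" into a performance model that can be solved automatically. The paper itself operates on UML/MARTE models with QVT transformations; the class and function names below (`DesignStep`, `to_performance_model`) and the single-server M/M/1 solution are hypothetical simplifications, not the authors' actual transformation.

```python
# Illustrative sketch only: a toy model-to-model transformation from
# annotated design steps to a solvable open queueing model. All names
# here are hypothetical; the paper uses UML/MARTE models and QVT.
from dataclasses import dataclass


@dataclass
class DesignStep:
    """One step of a design scenario, annotated (MARTE-style) with a
    CPU service demand in seconds (analogous to MARTE's hostDemand)."""
    name: str
    cpu_demand: float


def to_performance_model(steps, arrival_rate):
    """'Transform' the design scenario into an open queueing model and
    solve it: total service demand per job, server utilization, and the
    M/M/1 mean response time R = D / (1 - U)."""
    demand = sum(s.cpu_demand for s in steps)
    utilization = arrival_rate * demand
    if utilization >= 1.0:
        raise ValueError("system is saturated; model has no steady state")
    response_time = demand / (1.0 - utilization)
    return {"demand": demand,
            "utilization": utilization,
            "response_time": response_time}


# Example scenario: three annotated steps, 10 jobs/s arriving.
scenario = [DesignStep("validate", 0.010),
            DesignStep("encrypt", 0.030),
            DesignStep("store", 0.010)]
result = to_performance_model(scenario, arrival_rate=10.0)
```

With these numbers the total demand is 0.05 s per job, utilization is 0.5, and the mean response time is 0.1 s; the point is that the designer supplies only the annotated design model and the performance result falls out mechanically, which is what the QVT-based automation achieves at full scale.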


Keywords: Experience · Performance engineering · UML · MARTE · QVT



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Internet Interdisciplinary Institute (IN3), Universitat Oberta de Catalunya (UOC), Barcelona, Spain
  2. L&S Computer Technology, Inc., Austin, USA
  3. ICREA, Barcelona, Spain
