
Enabling Performance Modeling for the Masses: Initial Experiences

  • Abel Gómez
  • Connie U. Smith
  • Amy Spellmann
  • Jordi Cabot
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11150)

Abstract

Performance problems such as sluggish response time or low throughput are especially annoying, frustrating, and noticeable to users. Fixing performance problems after they occur results in unplanned expense and delay. Our vision is an MDE-intensive software development paradigm for complex systems in which software designers can evaluate performance early in development, when the analysis can have the greatest impact. We seek to empower designers to do the analysis themselves by automating the creation of performance models from standard design models. Such performance models can be solved automatically, providing results that are meaningful to designers. In our vision, this automation is enabled by model-to-model transformations: first, designers create UML design models annotated with the Modeling and Analysis of Real-Time and Embedded Systems (MARTE) profile; second, these models are automatically transformed into solvable performance models using QVT. This paper reports on our initial experiences implementing these two activities.
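To make the second activity concrete, the sketch below shows the general shape such a QVT Operational (QVTo) transformation could take. It is only an illustrative outline under stated assumptions, not the transformation reported in the paper: the target performance metamodel (here a hypothetical SPMIF package), its URI, and the mapping bodies are placeholders introduced for this example.

```
-- Illustrative QVTo sketch: UML+MARTE design model to a performance model.
-- The 'SPMIF' metamodel, its URI, and the mapping bodies are hypothetical.
modeltype UML uses 'http://www.eclipse.org/uml2/5.0.0/UML';
modeltype SPMIF uses 'http://example.org/spmif';

transformation Design2Performance(in design : UML, out perf : SPMIF);

main() {
    -- Start from the root UML model and map it to a performance model
    design.rootObjects()[UML::Model]->map toPerformanceModel();
}

mapping UML::Model::toPerformanceModel() : SPMIF::PerformanceModel {
    name := self.name;
    -- A real transformation would query the applied MARTE stereotypes here
    -- (e.g. GaScenario, GaStep) to derive workload intensities and resource
    -- demands for each step of the annotated design.
}
```

In a tool chain of this kind, a transformation like the above would be run against the UML model with the MARTE profile applied, and its output handed to a performance solver; the details of that pipeline are specific to each implementation.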

Keywords

Experience · Performance engineering · UML · MARTE · QVT


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Internet Interdisciplinary Institute (IN3), Universitat Oberta de Catalunya (UOC), Barcelona, Spain
  2. L&S Computer Technology, Inc., Austin, USA
  3. ICREA, Barcelona, Spain
