Towards Performance Tooling Interoperability: An Open Format for Representing Execution Traces

  • Dušan Okanović
  • André van Hoorn
  • Christoph Heger
  • Alexander Wert
  • Stefan Siegl
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9951)

Abstract

Execution traces capture information on a software system’s runtime behavior, including system-internal control flows, performance measurements, and request parameters and values. In research and industrial practice, execution traces serve as an important basis for model-based and measurement-based performance evaluation, e.g., for application performance monitoring (APM), extraction of descriptive and prescriptive models, and problem detection and diagnosis. A number of commercial and open-source APM tools are available that capture execution traces within distributed software systems. However, each tool uses its own (proprietary) format, so every approach that builds on execution trace data is tool-specific.
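To make concrete what such a trace contains, the following is a minimal sketch of a call-tree trace representation, written in Java. All names (TraceSketch, Invocation, the example operations) are hypothetical illustrations introduced here; they are not the OPEN.xtrace API or any tool’s actual data model.

```java
// Hypothetical, simplified model of an execution trace: a tree of timed
// method invocations annotated with request parameters. Illustrative only.
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class TraceSketch {

    /** One node in the call tree: an invoked operation with timing data. */
    static class Invocation {
        final String operation;                                   // e.g., "OrderService.process"
        final long startNanos;                                    // entry timestamp
        final long durationNanos;                                 // response time of this call
        final Map<String, String> params = new LinkedHashMap<>(); // request parameters
        final List<Invocation> children = new ArrayList<>();      // callees, in call order

        Invocation(String operation, long startNanos, long durationNanos) {
            this.operation = operation;
            this.startNanos = startNanos;
            this.durationNanos = durationNanos;
        }
    }

    /** Prints the call tree, one indented line per invocation. */
    static void print(Invocation node, int depth) {
        System.out.printf("%s%s (%.3f ms)%n",
                "  ".repeat(depth), node.operation, node.durationNanos / 1e6);
        for (Invocation child : node.children) {
            print(child, depth + 1);
        }
    }

    public static void main(String[] args) {
        // A trace as an APM agent might record it: an HTTP entry point that
        // calls business logic, which in turn issues a database query.
        Invocation root = new Invocation("GET /order", 0, 12_000_000);
        Invocation service = new Invocation("OrderService.process", 1_000_000, 9_000_000);
        service.params.put("orderId", "42");
        Invocation query = new Invocation("JDBC: SELECT * FROM orders", 2_000_000, 4_000_000);
        root.children.add(service);
        service.children.add(query);
        print(root, 0);
    }
}
```

Whatever the tool-specific encoding, the essential structure is the same: a tree of timed invocations annotated with request data, which is what a common interchange format has to represent.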

In this paper, we propose the OPEN.xtrace format to enable data interoperability and exchange between APM tools and software performance engineering (SPE) approaches. In particular, this enables SPE researchers to develop their approaches in a tool-agnostic and comparable manner. OPEN.xtrace is a community effort, part of the overall goal of increasing the interoperability of SPE/APM techniques and tools.
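As a rough illustration of the interoperability a common format enables, the sketch below (again in Java, with all names hypothetical and not taken from OPEN.xtrace) shows the adapter pattern such a format implies: each APM tool contributes an importer into the shared model, and an SPE analysis written once against that model works with data from any tool.

```java
// Hypothetical sketch of the interoperability pattern: per-tool importers map
// proprietary trace data onto one shared model; analyses target only that model.
public class InteropSketch {

    /** Tool-agnostic view of a trace; deliberately minimal. */
    interface CommonTrace {
        String rootOperation();
        long durationNanos();
    }

    /** Implemented once per APM tool's proprietary trace representation. */
    interface TraceImporter<S> {
        CommonTrace convert(S toolSpecificTrace);
    }

    /** An SPE analysis that works on data from any tool with an importer. */
    static boolean violatesSlo(CommonTrace trace, long thresholdNanos) {
        return trace.durationNanos() > thresholdNanos;
    }

    public static void main(String[] args) {
        // Stand-in "tool-specific" record: a CSV line as some agent might log it.
        String csvRecord = "GET /order;12000000";

        // Importer for that record format.
        TraceImporter<String> csvImporter = record -> {
            String[] fields = record.split(";");
            long duration = Long.parseLong(fields[1]);
            return new CommonTrace() {
                public String rootOperation() { return fields[0]; }
                public long durationNanos() { return duration; }
            };
        };

        CommonTrace trace = csvImporter.convert(csvRecord);
        System.out.println(trace.rootOperation() + " violates 5 ms SLO: "
                + violatesSlo(trace, 5_000_000));
    }
}
```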

In addition to describing the OPEN.xtrace format and its tooling support, we evaluate OPEN.xtrace by comparing its modeling capabilities with the information that is available in leading APM tools.

Acknowledgements

This work is being supported by the German Federal Ministry of Education and Research (grant no. 01IS15004, diagnoseIT), by the German Research Foundation (DFG) in the Priority Programme “DFG-SPP 1593: Design For Future—Managed Software Evolution” (HO 5721/1-1, DECLARE), and by the Research Group of the Standard Performance Evaluation Corporation (SPEC RG, http://research.spec.org). Special thanks go to Alexander Bran, Alper Hidiroglu, and Manuel Palenga — Bachelor’s students at the University of Stuttgart — for their support in the evaluation of the APM tools.


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Dušan Okanović (1)
  • André van Hoorn (1)
  • Christoph Heger (2)
  • Alexander Wert (2)
  • Stefan Siegl (2)
  1. Institute of Software Technology, Reliable Software Systems, University of Stuttgart, Stuttgart, Germany
  2. CA Application Performance Management, NovaTec Consulting GmbH, Leinfelden-Echterdingen, Germany
