Deriving Tailored UML Interaction Models from Scenario-Based Runtime Tests

  • Thorsten Haendler
  • Stefan Sobernig
  • Mark Strembeck
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 586)

Abstract

Documenting system behavior explicitly using graphical models (e.g., UML activity or sequence diagrams) facilitates communication about, and understanding of, software systems during development and maintenance tasks. Creating graphical models manually is a time-consuming and often error-prone task. Deriving models from system-execution traces, however, typically yields models so large that they become unmanageable for humans. This paper describes an approach for deriving behavior documentation, in the form of UML interaction models, from runtime tests. Key to our approach is leveraging the structure of scenario-based runtime tests to make the resulting interaction models and diagrams tailorable by humans for a given task. Each derived model represents a particular view on the test-execution trace. This way, one can benefit from tailored graphical models while controlling the model size. The approach builds on conceptual mappings (transformation rules) between a test-execution trace metamodel and the UML2 metamodel. In addition, we provide means to turn selected details of test specifications and of the testing environment (i.e., test parts and call scopes) into views on the test-execution trace (scenario-test viewpoint). A prototype implementation called KaleidoScope, based on a software-testing framework (STORM) and model transformations (Eclipse M2M/QVTo), is available.
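To illustrate the general idea behind such conceptual mappings, the following is a minimal, hypothetical sketch (not the authors' KaleidoScope/QVTo implementation): call events recorded in a test-execution trace are mapped to the lifelines and messages of a simplified UML interaction. All type and function names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class CallEvent:
    """One recorded call in a test-execution trace (hypothetical structure)."""
    caller: str
    callee: str
    operation: str

@dataclass
class Interaction:
    """Simplified stand-in for a UML2 Interaction: lifelines plus messages."""
    lifelines: list = field(default_factory=list)
    messages: list = field(default_factory=list)

def trace_to_interaction(events):
    """Map each distinct trace participant to a lifeline and each call event
    to a message between the corresponding lifelines."""
    model = Interaction()
    for e in events:
        for participant in (e.caller, e.callee):
            if participant not in model.lifelines:
                model.lifelines.append(participant)
        model.messages.append((e.caller, e.callee, e.operation))
    return model

# Example trace: a test driver calls Account.deposit, which calls Ledger.record.
trace = [CallEvent("test", "Account", "deposit"),
         CallEvent("Account", "Ledger", "record")]
model = trace_to_interaction(trace)
print(model.lifelines)       # ['test', 'Account', 'Ledger']
print(len(model.messages))   # 2
```

In the paper's approach, views (scenario-test viewpoints) would additionally filter which trace events enter the derived model, which is how model size stays controllable; the sketch above omits that filtering step.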

Keywords

Test-based documentation · Scenario-based testing · Test-execution trace · Scenario-test viewpoint · UML interactions · UML sequence diagram


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Thorsten Haendler (1)
  • Stefan Sobernig (1)
  • Mark Strembeck (1)
  1. Institute for Information Systems and New Media, Vienna University of Economics and Business (WU), Vienna, Austria
