Generating Software Documentation in Use Case Maps from Filtered Execution Traces

  • Edna Braun
  • Daniel Amyot
  • Timothy C. Lethbridge
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9369)

Abstract

One of the main issues in software maintenance is the time and effort needed to understand software. Software documentation and models are often incomplete, outdated, or non-existent, in part because of the cost and effort involved in creating and continually updating them. In this paper, we describe an innovative technique for automatically extracting and visualizing software behavioral models from execution traces. Lengthy traces are summarized by filtering out low-level software components via algorithms that utilize static and dynamic data. Eight such algorithms are compared in this paper. The traces are visualized using the Use Case Map (UCM) scenario notation. The resulting UCM diagrams depict the behavioral model of software traces and can be used to document the software. The tool-supported technique is customizable through different filtering algorithms and parameters, enabling the generation of documentation and models at different levels of abstraction.
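To make the filtering idea concrete, the following is a minimal sketch of one plausible utility-removal pass over an execution trace, using a fan-in threshold as the heuristic. The threshold value, the trace representation as (caller, callee) pairs, and the function names are illustrative assumptions; the paper's eight algorithms and their static/dynamic metrics are not reproduced here.

```python
def filter_utilities(trace, fanin_threshold=3):
    """Remove calls to routines with high fan-in (likely low-level utilities).

    trace: list of (caller, callee) pairs in execution order.
    Returns the trace with calls to high-fan-in callees filtered out.
    """
    # Fan-in metric: number of distinct callers per callee.
    callers = {}
    for caller, callee in trace:
        callers.setdefault(callee, set()).add(caller)
    fanin = {callee: len(cs) for callee, cs in callers.items()}
    # Keep only calls whose callee falls below the utility threshold.
    return [(c1, c2) for (c1, c2) in trace if fanin[c2] < fanin_threshold]

trace = [
    ("main", "openFile"), ("main", "log"), ("openFile", "log"),
    ("openFile", "parse"), ("parse", "log"), ("main", "report"),
]
# "log" is called from three distinct routines, so it is treated as a
# utility and removed; the remaining calls form the summarized trace.
print(filter_utilities(trace))
```

A summarized trace like this could then be mapped to UCM responsibilities along a scenario path; varying the threshold corresponds to generating views at different levels of abstraction.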

Keywords

Feature location · Software documentation · Trace summarization · Use Case Map · Utility · Visualization

Acknowledgements

This work was sponsored in part by the Natural Sciences and Engineering Research Council of Canada (NSERC, Discovery grant).


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Edna Braun (1)
  • Daniel Amyot (1)
  • Timothy C. Lethbridge (1)
  1. School of EECS, University of Ottawa, Ottawa, Canada
