Lessons Learned from Automated Analysis of Industrial UML Class Models (An Experience Report)  

  • Betty H. C. Cheng
  • Ryan Stephenson
  • Brian Berenbach
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3713)

Abstract

Automated analysis of object-oriented design models can provide insight into the quality of a given software design. Data obtained from automated analysis, however, is often too complex to be easily understood by a designer. This paper examines the use of an automated analysis tool on industrial UML class models, where one set of models was created as part of the design process and the other was obtained by reverse engineering code. The analysis was performed by DesignAdvisor, a tool developed by Siemens Corporate Research that supports metrics-based analysis and detection of design guideline violations. The paper describes the lessons learned from using the automated analysis techniques to assess the quality of these models. We also assess the impact of design pattern use on the overall quality of the models. Based on our lessons learned, we identify design guidelines that would minimize the occurrence of these errors.
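DesignAdvisor itself is proprietary, but the style of metrics-based guideline checking described in the abstract can be illustrated with a small sketch. The code below computes a coupling-between-objects (CBO) style count for each class in a toy class model and flags classes that exceed a guideline threshold; the model contents, dictionary format, and threshold value are all illustrative assumptions, not taken from the paper or the tool.

```python
# Illustrative sketch of metrics-based guideline checking over a toy class
# model. The model format, class names, and threshold are hypothetical.

# Each class maps to the set of other classes it depends on
# (associations, attribute types, parameter types, etc.).
model = {
    "Order":    {"Customer", "LineItem", "Invoice", "Shipment"},
    "Customer": {"Order"},
    "LineItem": {"Product"},
    "Product":  set(),
    "Invoice":  {"Order", "Customer"},
    "Shipment": {"Order"},
}

CBO_THRESHOLD = 3  # hypothetical guideline limit on per-class coupling

def cbo(model, cls):
    """Coupling between objects: classes this class uses, plus classes using it."""
    uses = model[cls]
    used_by = {other for other, deps in model.items() if cls in deps}
    return len(uses | used_by)

# Report every class whose coupling exceeds the guideline threshold.
violations = {c: cbo(model, c) for c in model if cbo(model, c) > CBO_THRESHOLD}
for cls, value in sorted(violations.items()):
    print(f"guideline violation: {cls} has CBO {value} > {CBO_THRESHOLD}")
```

In this toy model only `Order` is flagged (it touches four other classes), which mirrors the kind of per-class metric report an analysis tool can present to a designer.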

Keywords

Automated Analysis · Design Patterns · Design Errors · Software Metrics · Lessons Learned
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Betty H. C. Cheng (1)
  • Ryan Stephenson (1)
  • Brian Berenbach (2)

  1. Software Engineering and Network Systems Laboratory, Department of Computer Science and Engineering, Michigan State University, East Lansing, USA
  2. Siemens Corporate Research, Inc.
