Assessing Software Product Maintainability Based on Class-Level Structural Measures

  • Hans Christian Benestad
  • Bente Anda
  • Erik Arisholm
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4034)


A number of structural measures have been suggested to support the assessment and prediction of software quality attributes. The aim of our study is to investigate how class-level measures of structural properties can be used to assess the maintainability of a software product as a whole. We survey, structure, and discuss current practices on this topic, and apply alternative strategies to four functionally equivalent systems that were constructed as part of a multi-case study. In the absence of the historical data needed to build statistically based prediction models, we apply elements of judgment in the assessment. We show how triangulation of alternative strategies, as well as sensitivity analysis, may increase confidence in assessments that contain elements of judgment. This paper contributes to more systematic practices in the application of structural measures. Further research is needed to evaluate and improve the accuracy and precision of judgment-based strategies.







Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Hans Christian Benestad (1)
  • Bente Anda (1)
  • Erik Arisholm (1)

  1. Simula Research Laboratory, Lysaker, Norway
