Comparing Software Architecture Descriptions and Raw Source-Code: A Statistical Analysis of Maintainability Metrics

  • Eudisley Anjos
  • Fernando Castor
  • Mário Zenha-Rela
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7973)


Software systems are exposed to constant change over short periods of time. This demands highly maintainable systems and makes maintainability one of the most important quality attributes. In this work we perform a statistical analysis of maintainability metrics in three mainstream open-source applications: Tomcat (a web server), jEdit (a text editor), and Vuze (a peer-to-peer client). The metrics are applied to the source code and compared with similar metrics derived at the architectural level, using scatter plots, Pearson’s correlation coefficient, and significance tests. The observations contradict the common assumption that software quality attributes (a.k.a. non-functional requirements) are mostly determined at the architectural level, and they raise new issues for future work in this field.
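The statistical procedure named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual analysis: the metric values below are made-up per-module numbers standing in for a source-code metric and its architecture-level counterpart, and the significance test is the standard t-test for a Pearson correlation.

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for H0: rho = 0, with n - 2 degrees of freedom."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# Illustrative per-module values (NOT the paper's data): e.g. a coupling
# metric measured on source code vs. the same metric derived from the
# architectural description of the same modules.
source_metric = [4, 7, 2, 9, 5, 6, 3, 8]
arch_metric   = [3, 6, 2, 7, 4, 5, 2, 7]

r = pearson_r(source_metric, arch_metric)
t = t_statistic(r, len(source_metric))
print(f"r = {r:.3f}, t = {t:.3f}")
```

A high `r` with a significant `t` would indicate that the architecture-level metric tracks its source-level counterpart; the paper's finding is that, for the studied systems, such agreement cannot be taken for granted.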





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Eudisley Anjos (1, 2)
  • Fernando Castor (3)
  • Mário Zenha-Rela (1)
  1. CISUC, Centre for Informatics and Systems, University of Coimbra, Portugal
  2. CI, Informatics Center, Federal University of Paraíba, Brazil
  3. CIn, Informatics Center, Federal University of Pernambuco, Brazil
