Software Quality Journal, Volume 20, Issue 2, pp 265–285

Faster issue resolution with higher technical quality of software

  • Dennis Bijlsma
  • Miguel Alexandre Ferreira
  • Bart Luijten
  • Joost Visser


We performed an empirical study of the relation between the technical quality of software products and the issue resolution performance of their maintainers. In particular, we tested the hypothesis that ratings for source code maintainability, as employed by the Software Improvement Group (SIG) quality model, are correlated with ratings for issue resolution speed. We tested the hypothesis separately for issues of type defect and of type enhancement. The study revealed that all but one of the metrics of the SIG quality model show a significant positive correlation with the resolution speed of defects, enhancements, or both.
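The correlation test described above is a rank correlation between two sets of ratings. As a minimal sketch of that analysis, the following computes Spearman's rank correlation in plain Python; the rating values are hypothetical illustrative data, not figures from the study:

```python
# Spearman rank correlation between hypothetical maintainability ratings
# and issue-resolution-speed ratings (illustrative data only).

def ranks(values):
    """1-based average ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation of the rank vectors (valid in the presence of ties)."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

maintainability = [2.5, 3.0, 3.5, 4.0, 4.5]   # hypothetical star ratings
resolution_speed = [2.0, 3.5, 3.0, 4.5, 5.0]  # hypothetical star ratings
print(round(spearman_rho(maintainability, resolution_speed), 2))  # → 0.9
```

A positive ρ, as printed here, corresponds to the paper's finding: systems rated higher on a maintainability metric tend to be rated higher on resolution speed. In practice one would also compute a p-value (e.g. via `scipy.stats.spearmanr`) to judge significance.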


Keywords: Software defects · Defect resolution · Maintainability · Source code metrics · Rank correlation · Issue tracker mining



We thank the developers of the various open source systems for the communication that helped us clean the data and interpret the results of our study.



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  • Dennis Bijlsma (1)
  • Miguel Alexandre Ferreira (2)
  • Bart Luijten (3)
  • Joost Visser (2)

  1. University of Amsterdam, Amsterdam, The Netherlands
  2. Software Improvement Group, Amsterdam, The Netherlands
  3. Delft University of Technology, Delft, The Netherlands
