Towards a Unified Definition of Open Source Quality

  • Claudia Ruiz
  • William Robinson
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 365)

Abstract

Software quality needs to be specified and evaluated in order to determine the success of a development project, but this is a challenge with Free/Libre Open Source Software (FLOSS) because of its permanently emergent state. This has not deterred the growing assumption that FLOSS is of higher quality than traditionally developed software, despite mixed research results. With this literature review, we found that the reason for these mixed results is that quality is being defined, measured, and evaluated differently. We report the most popular definitions, which include software structure measures, process measures such as defect fixing, and maturity assessment models. The way researchers have built their samples has also contributed to the mixed results, with different project properties being considered or ignored. Because FLOSS projects are continually evolving, so is their quality, and it must be measured using metrics that take into account the community's commitment to quality rather than just the software's structure. Challenges remain in defining what constitutes a defect or bug and in understanding the role of modularity in FLOSS quality.

Keywords

Open source software · Quality measurement · Literature review

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Claudia Ruiz (1)
  • William Robinson (1)
  1. Computer Information Systems Department, Georgia State University, Atlanta, USA
