
A Comparison Framework for Open Source Software Evaluation Methods

  • Klaas-Jan Stol
  • Muhammad Ali Babar
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 319)

Abstract

The use of Open Source Software (OSS) components has become a viable alternative to Commercial Off-The-Shelf (COTS) components in product development. Since the quality of OSS products varies widely, both industry and the research community have proposed several OSS evaluation methods tailored to the specific characteristics of OSS. We have systematically identified these methods and present a framework for comparing them.

Keywords

Open source software, evaluation method, comparison framework


Copyright information

© IFIP 2010

Authors and Affiliations

  • Klaas-Jan Stol, Lero, University of Limerick, Limerick, Ireland
  • Muhammad Ali Babar, IT University of Copenhagen, Denmark
