Design and Evaluation of a Process for Identifying Architecture Patterns in Open Source Software

  • Klaas-Jan Stol
  • Paris Avgeriou
  • Muhammad Ali Babar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6903)

Abstract

Architecture patterns have a direct effect (positive or negative) on a system’s quality attributes (e.g., performance). Therefore, information about patterns used in a product can provide valuable insights to, e.g., component integrators who wish to evaluate a software product. Unfortunately, this information is often not readily available, in particular for Open Source Software (OSS) products, which are increasingly used in component-based development. This paper presents the design and evaluation of a process for Identifying Architecture Patterns in OSS (“IDAPO”). The results of the evaluation suggest that IDAPO is helpful to identify potentially present patterns, and that a process framework may provide better opportunities for tailoring to the users’ needs.

Keywords

architecture patterns · quality attributes · open source software · empirical evaluation · quasi-experiment

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  1. Klaas-Jan Stol, Lero, the Irish Software Engineering Research Centre, University of Limerick, Ireland
  2. Paris Avgeriou, University of Groningen, The Netherlands
  3. Muhammad Ali Babar, IT University of Copenhagen, Denmark