Empirical Software Engineering, Volume 9, Issue 1–2, pp 111–137

Knowledge-Sharing Issues in Experimental Software Engineering

  • Forrest Shull
  • Manoel G. Mendonça
  • Victor Basili
  • Jeffrey Carver
  • José C. Maldonado
  • Sandra Fabbri
  • Guilherme Horta Travassos
  • Maria Cristina Ferreira


Recently, awareness of the importance of replicating studies has been growing in the empirical software engineering community. The results of any one study cannot simply be extrapolated to all environments, because there are many uncontrollable sources of variation between different environments.

In our work, we have reasoned that the availability of laboratory packages for experiments can encourage better replications and complementary studies. However, even with effectively specified laboratory packages, transfer of experimental know-how can still be difficult. In this paper, we discuss the collaboration structures we have been using in the Readers’ Project, a bilateral project supported by the Brazilian and American national science agencies that is investigating issues of replication and transfer of experimental know-how. In particular, we discuss how these structures map to the Nonaka–Takeuchi knowledge-sharing model, a well-known paradigm in the knowledge management literature. We describe an instantiation of the Nonaka–Takeuchi model for software engineering experimentation, establishing a framework for discussing knowledge-sharing issues in experimental software engineering. We use two replications to illustrate some of the knowledge-sharing issues we have faced and discuss the mechanisms we are using to tackle those issues in the Readers’ Project.

Keywords: empirical software engineering; experimental design; experimental replication; software reading techniques




  1. Agresti, W. 2000. Knowledge management. Advances in Computers 53: 171–283.
  2. Amaral, E. A. G. G., and Travassos, G. H. 2003. A package model for software engineering experiments. Proc. of the 2003 ACM-IEEE International Symposium on Empirical Software Engineering (ISESE 2003), Rome. (To appear, poster session.)
  3. Basili, V. R., Green, S., Laitenberger, O., Lanubile, F., Shull, F., Soerumgaard, S., and Zelkowitz, M. V. 1996. The empirical investigation of perspective-based reading. Empirical Software Engineering 1(2): 133–164.
  4. Basili, V. R., Shull, F., and Lanubile, F. 1999. Building knowledge through families of experiments. IEEE Transactions on Software Engineering 25(4): 456–473.
  5. Basili, V. R., Tesoriero, R., Costa, P., Lindvall, M., Rus, I., Shull, F., and Zelkowitz, M. V. 2001. Building an experience base for software engineering: A report on the first CeBASE eWorkshop. PROFES (Product Focused Software Process Improvement), Kaiserslautern, Germany, 110–125.
  6. Basili, V. R., Lindvall, M., and Costa, P. 2001. Implementing the experience factory concepts as a set of experience bases. Proc. of the 13th International Conference on Software Engineering & Knowledge Engineering, Buenos Aires, 102–109.
  7. Brooks, A., Daly, J., Miller, J., Roper, M., and Wood, M. 1996. Replication of experimental results in software engineering. ISERN Technical Report ISERN-96-10.
  8. Carver, J. 2003. The impact of background and experience on software inspections. PhD thesis, Dept. of Computer Science, University of Maryland, April 2003. (Also available as University of Maryland Department of Computer Science Technical Report CS-TR-4476.)
  9. Ciolkowski, C., Differding, C., Laitenberger, O., and Muench, J. 1997. Empirical investigation of perspective-based reading: A replicated experiment. ISERN Technical Report ISERN-97-13.
  10. Conradi, R., Basili, V. R., Carver, J., Shull, F., and Travassos, G. H. 2001. A pragmatic documents standard for an experience library: Roles, documents, contents and structure. University of Maryland Technical Report CS-TR-4235.
  11. Dixon, N. M. 2000. Common Knowledge: How Companies Thrive by Sharing What They Know. Boston, MA: Harvard Business School Press.
  12. Goth, G. 2001. New center will help software development “grow up”. IEEE Software 18(3): 99–102.
  13. Johnson, P. 1996. BRIE: The benchmark inspection experiment. Department of Information and Computer Sciences, University of Hawaii, Technical Report csdl-93-13.
  14. Laitenberger, O., El Emam, K., and Harbich, T. 2001. An internally replicated quasi-experimental comparison of checklist and perspective-based reading of code documents. IEEE Transactions on Software Engineering 27(5): 387–421.
  15. Lindvall, M., Rus, I., and Sinha, S. 2002. Technology support for knowledge management. Proc. of the 4th International Workshop on Learning Software Organizations (LSO '02), Chicago, Illinois.
  16. Lott, C., and Rombach, D. 1996. Repeatable software engineering experiments for comparing defect-detection techniques. Empirical Software Engineering: An International Journal 1(3): 241–277.
  17. Hyperwave, 2003.
  18. Mendonça, M. G., and Sunderhaft, N. L. 1999. A state of the art report: Mining software engineering data. Rome, NY: U.S. Department of Defense (DoD) Data & Analysis Center for Software.
  19. Mendonça Neto, M. G., Seaman, C., Basili, V. R., and Kim, Y. M. 2001. A prototype experience management system for a software consulting organization. Proc. of the 13th International Conference on Software Engineering and Knowledge Engineering, Buenos Aires, 29–36.
  20. Miller, J. 2000. Applying meta-analytical procedures to software engineering experiments. Journal of Systems and Software 54: 29–39.
  21. Nonaka, I., and Takeuchi, H. 1995. The Knowledge-Creating Company. Oxford, UK: Oxford University Press.
  22. Rus, I., Lindvall, M., and Sinha, S. 2001. Knowledge management in software engineering. The Data & Analysis Center for Software (DACS) State-of-the-Art Report.
  23. Shull, F., Carver, J., and Travassos, G. H. 2001. An empirical methodology for introducing software processes. Proc. of the 8th European Software Engineering Conference, Vienna, Austria, 288–296.
  24. Shull, F., Basili, V. R., Carver, J., Maldonado, J. C., Travassos, G. H., Mendonça Neto, M. G., and Fabbri, S. C. P. F. 2002. Replicating software engineering experiments: Addressing the tacit knowledge problem. Proc. of the 2002 International Symposium on Empirical Software Engineering (ISESE 2002), Nara, Japan, 7–16.
  25. Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., and Wesslén, A. 2000. Experimentation in Software Engineering: An Introduction. Massachusetts: Kluwer Academic Publishers.
  26. Wiig, K. 1999. Comprehensive knowledge management. Knowledge Research Institute, Inc., Working Paper KRI #1999-4, Revision 2.

Copyright information

© Kluwer Academic Publishers 2004

Authors and Affiliations

  • Forrest Shull (1)
  • Manoel G. Mendonça (2)
  • Victor Basili (3)
  • Jeffrey Carver (4)
  • José C. Maldonado (5)
  • Sandra Fabbri (6)
  • Guilherme Horta Travassos (7)
  • Maria Cristina Ferreira (5)
  1. Fraunhofer Center for Experimental Software Engineering, College Park
  2. Computer Networks Research Group (NUPERC), Salvador University (UNIFACS), Brazil
  3. Fraunhofer Center for Experimental Software Engineering, Department of Computer Science, University of Maryland, College Park
  4. Department of Computer Science, University of Maryland, College Park
  5. Departamento de Ciências da Computação e Estatística, Instituto de Ciências Matemáticas e de Computação, São Carlos, SP, Brazil
  6. Departamento de Computação, Universidade Federal de São Carlos, São Carlos, SP, Brazil
  7. Jardim Guanabara – Ilha do Governador, Rio de Janeiro, RJ, Brazil
