Empirical Software Engineering, Volume 19, Issue 1, pp 241–266

Investigation of individual factors impacting the effectiveness of requirements inspections: a replicated experiment

Abstract

This paper presents a replication of an empirical study on the impact of individual factors on the effectiveness of requirements inspections. Experimental replications are important for verifying results and investigating the generality of empirical studies. We used the lab package and procedures from the original study, with some changes and additions, to conduct the replication with 69 professional developers at three companies in Turkey. In general, the results of the replication were consistent with those of the original study. The main result of the original study, which the replication supports, was that inspectors whose degrees are in a field related to software engineering are less effective during a requirements inspection than inspectors whose degrees are in other fields. In addition, we found that Company, Experience, and English Proficiency affected inspection effectiveness.

Keywords

Software inspections · Software engineering · Empirical studies · Replication · Requirements

Acknowledgment

We thank the employees and management of the companies for their participation. We thank Jorge L. Diaz-Herrera, Mustafa Akgül, Duygu Albayrak, Erhan Yüceer, and David Davenport for their help with the education survey. We thank Natalia Juristo for her comments on the paper.

Copyright information

© Springer Science+Business Media, LLC 2012

Authors and Affiliations

  1. Department of Computer Technology and Information Systems, Bilkent University, Ankara, Turkey
  2. Department of Computer Science, University of Alabama, Tuscaloosa, USA
