
Fully employing software inspections data

Original Paper · Innovations in Systems and Software Engineering

Abstract

Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, and test plans. Common to all inspections is the aim of finding and fixing defects as early as possible, thereby saving cost by minimizing the rework needed later in the life cycle. Measurement data, such as the number and type of defects found and the effort spent by the inspection team, not only provide direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA’s use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross-project improvements. Such improvements may target the software development processes of the whole organization as well as the applied inspection process itself.
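To make the kind of analysis described above concrete, the sketch below shows one way cross-project monitoring of inspection data might look. It is an illustration only: the record fields (artifact type, inspected size, defects found, effort) and the outlier threshold are hypothetical stand-ins, not the actual metrics or analyses applied to the NASA data discussed in the paper.

```python
# Illustrative sketch only: the record fields and the threshold below are
# hypothetical, not taken from the NASA inspection data set in the paper.
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class Inspection:
    project: str          # project that ran the inspection
    artifact: str         # e.g. "requirements", "design", "code"
    size: float           # inspected size, e.g. pages or KSLOC
    defects: int          # defects found by the inspection team
    effort_hours: float   # total team effort spent on the inspection

def defect_density(i: Inspection) -> float:
    """Defects found per unit of inspected size."""
    return i.defects / i.size

def cross_project_outliers(inspections, factor=2.0):
    """Flag projects whose mean defect density deviates from the
    organizational baseline by more than `factor` in either direction,
    one possible trigger for a cross-project improvement action."""
    by_project = defaultdict(list)
    for i in inspections:
        by_project[i.project].append(defect_density(i))
    baseline = mean(defect_density(i) for i in inspections)
    return {p: mean(ds) for p, ds in by_project.items()
            if mean(ds) > factor * baseline or mean(ds) < baseline / factor}

# Usage: project B's unusually low defect yield stands out against the baseline.
data = [
    Inspection("A", "code", 1.2, 30, 16.0),
    Inspection("A", "design", 20.0, 12, 10.0),
    Inspection("B", "code", 1.0, 2, 14.0),
]
print(cross_project_outliers(data))  # -> {'B': 2.0}
```

Running the example flags the low-yield project B for follow-up: such a deviation could mean an unusually clean product, or an inspection process that is missing defects, and either way it warrants a closer look at the product, the development process, or the inspection process itself.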



Corresponding author

Correspondence to Forrest Shull.


About this article

Cite this article

Shull, F., Feldmann, R.L., Seaman, C. et al. Fully employing software inspections data. Innovations Syst Softw Eng 8, 243–254 (2012). https://doi.org/10.1007/s11334-010-0132-1

