
Evolving Defect “Folklore”: A Cross-Study Analysis of Software Defect Behavior

  • Victor Basili
  • Forrest Shull
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3840)

Abstract

Addressing “macro-process” research questions – which require understanding how development processes do or do not fit different organizational systems and environments – requires families of related studies. While there are many sources of variation between development contexts, it is not clear a priori which specific variables influence the effectiveness of a process in a given context. These variables can only be discovered opportunistically, by comparing process effects across different environments and analyzing points of difference.

In this paper, we illustrate this approach and the conclusions that can be drawn by presenting a family of studies on the subject of software defects and their behaviors – a key phenomenon for understanding macro-process issues. Specifically, we identify common “folklore,” i.e. widely accepted heuristics concerning how defects behave, and then build up a body of knowledge from empirical studies to refine the heuristics with information concerning the conditions under which they do and do not hold.

Keywords

Software Engineer, Context Variable, Software Defect, Development Context, Interface Defect



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Victor Basili 1
  • Forrest Shull 2
  1. Dept. of Computer Science, University of Maryland, College Park, USA
  2. Fraunhofer Center – Maryland, College Park, USA
