Evolving Defect “Folklore”: A Cross-Study Analysis of Software Defect Behavior
Answering “macro-process” research issues – which require understanding how development processes fit or do not fit in different organizational systems and environments – requires families of related studies. While there are many sources of variation between development contexts, it is not clear a priori what specific variables influence the effectiveness of a process in a given context. These variables can only be discovered opportunistically, by comparing process effects from different environments and analyzing points of difference.
In this paper, we illustrate this approach and the conclusions that can be drawn from it by presenting a family of studies on the subject of software defects and their behaviors – a key phenomenon for understanding macro-process issues. Specifically, we identify common “folklore,” i.e., widely accepted heuristics concerning how defects behave, and then build up a body of knowledge from empirical studies to refine those heuristics with information about the conditions under which they do and do not hold.
Keywords: Software Engineer, Context Variable, Software Defect, Development Context, Interface Defect