An Integrated Approach to Detect Fault-Prone Modules Using Complexity and Text Feature Metrics

  • Osamu Mizuno
  • Hideaki Hata
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6059)

Abstract

Early detection of fault-prone products is necessary to assure the quality of a software product, and fault-prone module detection is therefore one of the major and traditional areas of software engineering. Although many approaches to detecting fault-prone modules exist, each has its own pros and cons, so the appropriate approach should be chosen for each situation. This paper presents an integrated approach that combines two different fault-prone module detection approaches.

To do so, we prepare two fault-prone module detection approaches: a text-feature-based approach using a naive Bayes classifier and a complexity-metrics-based approach using logistic regression. The former was proposed by us, while the latter is a widely used approach. As experimental data, we used a publicly available data set obtained from Eclipse.
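
The following is a minimal sketch of the two baseline detectors described above, using scikit-learn as a stand-in; the paper's own implementation, tooling, and feature extraction may differ, and the module texts, metric values, and labels shown here are hypothetical placeholders.

```python
# Sketch of the two baseline detectors (assumes scikit-learn is available).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training data: raw module source text, complexity metric rows,
# and fault labels (1 = faulty, 0 = not faulty).
module_texts = [
    "public void foo() { if (x) bar(); }",
    "int add(int a, int b) { return a + b; }",
]
complexity_metrics = [[120, 14], [8, 1]]   # e.g. [lines of code, cyclomatic complexity]
labels = [1, 0]

# Text-feature approach: treat source-code tokens like words in spam filtering
# and classify with a naive Bayes model.
vectorizer = CountVectorizer(token_pattern=r"\S+")
text_model = MultinomialNB().fit(vectorizer.fit_transform(module_texts), labels)

# Complexity-metrics approach: logistic regression over the metric values.
metric_model = LogisticRegression().fit(complexity_metrics, labels)

# Each model independently predicts whether a new module is fault-prone.
new_text = ["void baz() { while (true) { /* ... */ } }"]
new_metrics = [[45, 6]]
print(text_model.predict(vectorizer.transform(new_text)))
print(metric_model.predict(new_metrics))
```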

A pre-experiment shows that each approach has its own strengths: the text-feature-based approach achieves high recall, while the complexity-metrics-based approach achieves high precision. To exploit these merits, we propose an integrated approach that applies both techniques to fault-prone module detection. The experimental results show that the integrated approach achieves better accuracy than either approach alone.
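
The abstract does not spell out how the two predictors are combined, so the snippet below shows only one plausible two-stage rule that exploits the stated strengths (high recall of the text model, high precision of the metrics model); it is an illustrative assumption, not the paper's actual integration scheme, and the thresholds are hypothetical.

```python
def integrated_predict(text_prob, metric_prob,
                       text_threshold=0.5, metric_threshold=0.5):
    """Return 1 (fault-prone) or 0 (not fault-prone) for one module."""
    if metric_prob >= metric_threshold:
        return 1   # high-precision positive from the complexity model: trust it
    if text_prob < text_threshold:
        return 0   # high-recall negative from the text model: trust it
    # Ambiguous case (text model flags the module, complexity model does not):
    # fall back to the average of the two probabilities.
    return int((text_prob + metric_prob) / 2.0 >= 0.5)

# Hypothetical usage with the models from the previous sketch:
# p_text = text_model.predict_proba(vectorizer.transform(new_text))[0, 1]
# p_metric = metric_model.predict_proba(new_metrics)[0, 1]
# print(integrated_predict(p_text, p_metric))
```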

Keywords

Complexity Metrics, Text Feature, Fault Prediction, Extract Text Feature, Defect Prediction Model

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Osamu Mizuno (1)
  • Hideaki Hata (2)
  1. Kyoto Institute of Technology, Kyoto, Japan
  2. Osaka University, Osaka, Japan
