
Which Defect Should Be Fixed First? Semantic Prioritization of Static Analysis Report

  • Han Wang
  • Min Zhou
  • Xi Cheng
  • Guang Chen
  • Ming Gu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11293)

Abstract

The usability of static analyzers is plagued by excessive false alarms. Manually examining whether defect reports are spurious is laborious yet error-prone. Moreover, the inability to filter out overwhelming false alarms undermines users' confidence in such tools and severely limits their adoption in development cycles. In this paper, we propose a semantic approach for prioritizing defect reports emitted by static analysis. Our approach evaluates the importance of defect reports by their fatality and prioritizes defects by their effect on critical functions. Compared to existing approaches that prioritize defect reports by analyzing external attributes, ours substantially utilizes semantic information derived by static analysis to measure the severity of defect reports more precisely. We have implemented a prototype and evaluated it on real-world code bases; the results show that our approach can effectively evaluate the severity of defects.
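
To make the general idea concrete, the following is a minimal, hypothetical Python sketch of severity-based prioritization in the spirit described above, not the paper's actual algorithm. It assumes each defect report carries a defect kind with an associated fatality weight and the set of functions the defect's effect can propagate to, as a static analyzer might derive; all names, weights, and the scoring formula are illustrative assumptions.

```python
# Hypothetical sketch: rank defect reports by fatality and by how many
# critical functions their effect reaches. Weights and scoring are assumed,
# not taken from the paper.
from dataclasses import dataclass, field

FATALITY_WEIGHT = {"crash": 3.0, "memory_leak": 2.0, "style": 0.5}  # assumed scale


@dataclass
class DefectReport:
    defect_id: str
    kind: str                                  # e.g. "crash", "memory_leak"
    affected_functions: set[str] = field(default_factory=set)


def severity(report: DefectReport, critical_functions: set[str]) -> float:
    """Combine fatality with the number of critical functions the defect affects."""
    fatality = FATALITY_WEIGHT.get(report.kind, 1.0)
    critical_hits = len(report.affected_functions & critical_functions)
    return fatality * (1.0 + critical_hits)


def prioritize(reports: list[DefectReport],
               critical_functions: set[str]) -> list[DefectReport]:
    """Return reports ordered from most to least severe."""
    return sorted(reports, key=lambda r: severity(r, critical_functions), reverse=True)


if __name__ == "__main__":
    critical = {"parse_request", "write_response"}
    reports = [
        DefectReport("D1", "style", {"log_debug"}),
        DefectReport("D2", "memory_leak", {"parse_request"}),
        DefectReport("D3", "crash", {"parse_request", "write_response"}),
    ]
    for r in prioritize(reports, critical):
        print(r.defect_id, round(severity(r, critical), 2))
```

In this toy scoring, a crash that reaches both critical functions outranks a leak that reaches one, which in turn outranks a stylistic issue confined to non-critical code.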

Keywords

Defect prioritization · Static analysis · Defect propagation analysis


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Han Wang (1), corresponding author
  • Min Zhou (1)
  • Xi Cheng (1)
  • Guang Chen (1)
  • Ming Gu (1)

  1. School of Software, Tsinghua University, Beijing, China
