
Reducing Static Analysis Alarms Based on Non-impacting Control Dependencies

  • Tukaram Muske
  • Rohith Talluri
  • Alexander Serebrenik
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11893)

Abstract

Static analysis tools help to detect programming errors but generate a large number of alarms. Repositioning of alarms is a recently proposed technique to reduce the number of alarms by replacing a group of similar alarms with a smaller number of newly created representative alarms. However, the technique fails to replace a group of similar alarms with fewer representative alarms, mainly when the immediately enclosing conditional statements of the alarms are different and not nested. This limitation is due to the conservative assumption that a conditional statement enclosing an alarm may prevent the alarm from being an error.
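For concreteness, consider a hypothetical C fragment of our own making (the function and variable names are illustrative, not taken from the paper): the two division alarms on d are similar, yet their immediately enclosing conditionals a > 0 and b > 0 are different and not nested, so the conservative repositioning cannot hoist a single alarm above both and keeps both alarms.

    int f(int a, int b, int d) {
        int r = 0;
        if (a > 0) {
            r += 100 / d;   /* alarm 1: possible division by zero */
        }
        if (b > 0) {
            r += 200 / d;   /* alarm 2: possible division by zero */
        }
        return r;
    }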

To address this limitation, we introduce the notion of non-impacting control dependencies (NCDs). An NCD of an alarm is a transitive control dependency of the alarm's program point that does not affect whether the alarm is an error. We approximate the computation of NCDs based on the alarms that are similar, and then reposition the similar alarms by taking the effect of their NCDs into account. NCD-based repositioning merges more similar alarms together and represents them by fewer representative alarms than the state-of-the-art repositioning technique. Thus, it can be expected to further reduce the number of alarms.
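In the fragment above, neither a > 0 nor b > 0 reads or writes d, so neither condition affects whether the divisions are errors: both conditions are NCDs of the two alarms, and the alarms can be replaced by a single representative alarm placed before the first conditional. The following deliberately simplified check conveys the idea; it tests only variable overlap between a condition and an alarm expression, and it is our illustration, not the approximation used in the paper.

    #include <stdbool.h>
    #include <string.h>

    /* A set of variable names read by a condition or an alarm expression. */
    typedef struct {
        const char **vars;
        int n;
    } var_set;

    static bool intersects(const var_set *a, const var_set *b) {
        for (int i = 0; i < a->n; i++)
            for (int j = 0; j < b->n; j++)
                if (strcmp(a->vars[i], b->vars[j]) == 0)
                    return true;
        return false;
    }

    /* Treat a transitive control dependency as non-impacting for an
     * alarm when its condition involves none of the variables the
     * alarm expression depends on. This is an over-simplification:
     * a real analysis must also account for aliasing and indirect
     * effects of the condition on the alarm variables. */
    bool is_ncd(const var_set *cond_vars, const var_set *alarm_vars) {
        return !intersects(cond_vars, alarm_vars);
    }

Under this check, the condition variable sets {a} and {b} do not intersect the alarm variable set {d}, so both control dependencies are reported as non-impacting and the two alarms can share one representative alarm.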

To measure the reduction obtained, we evaluate NCD-based repositioning using a total of 105,546 alarms generated on 16 open source C applications, 11 industry C applications, and 5 industry COBOL applications. The evaluation results indicate that, compared to the state-of-the-art repositioning technique, NCD-based repositioning reduces the number of alarms by up to 23.57%, 29.77%, and 36.09%, respectively. The median reductions are 9.02%, 17.18%, and 28.61%, respectively.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Tukaram Muske (1)
  • Rohith Talluri (1)
  • Alexander Serebrenik (2)
  1. TRDDC, TCS Research, Pune, India
  2. Eindhoven University of Technology, Eindhoven, The Netherlands
