
An Experimental Evaluation of Deliberate Unsoundness in a Static Program Analyzer

  • Maria Christakis
  • Peter Müller
  • Valentin Wüstholz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8931)

Abstract

Many practical static analyzers are not completely sound by design. Their designers deliberately trade soundness for greater automation, better performance, and fewer false positives or a lower annotation overhead. However, the impact of such design decisions on the effectiveness of an analyzer is not well understood. This paper reports on the first systematic effort to document and evaluate the sources of unsoundness in a static analyzer. We developed a code instrumentation that reflects the sources of deliberate unsoundness in the .NET static analyzer Clousot and applied it to code from six open-source projects. We found that 33% of the instrumented methods were analyzed soundly. In the remaining methods, Clousot made unsound assumptions, which were violated during concrete executions in 2–26% of the methods. Manual inspection of these methods showed that no errors were missed due to an unsound assumption, which suggests that Clousot’s unsoundness does not compromise its effectiveness. Our findings can guide users of static analyzers in applying them fruitfully, and designers in choosing good trade-offs.
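
To illustrate the kind of instrumentation the abstract describes, the sketch below shows, under assumptions not taken from the paper, how one unsound design choice (here, ignoring arithmetic overflow) might be turned into a runtime check that records whether the assumption holds during concrete executions, for example while running a project’s test suite. The Assumed helper, the site identifiers, and the Account example are hypothetical and only illustrate the idea; they are not the authors’ actual tool, which instruments .NET code analyzed by Clousot.

    // Hypothetical sketch (not the authors' tool): an "assumed statement" helper
    // that checks, at run time, whether an unsound assumption made by a static
    // analyzer actually holds during a concrete execution, and records violations.
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.concurrent.atomic.AtomicLong;

    final class Assumed {
        // Per assumption site: how often it was reached and how often it was violated.
        private static final ConcurrentMap<String, AtomicLong> reached = new ConcurrentHashMap<>();
        private static final ConcurrentMap<String, AtomicLong> violated = new ConcurrentHashMap<>();

        // Evaluate an assumption at a given site; execution continues either way,
        // so the instrumentation only observes behaviour, it does not change it.
        static void check(String siteId, boolean assumptionHolds) {
            reached.computeIfAbsent(siteId, k -> new AtomicLong()).incrementAndGet();
            if (!assumptionHolds) {
                violated.computeIfAbsent(siteId, k -> new AtomicLong()).incrementAndGet();
            }
        }

        // Print, for each violated assumption, how often it failed relative to how
        // often it was reached during the concrete executions.
        static void report() {
            violated.forEach((site, count) ->
                System.out.printf("assumption at %s violated %d of %d times%n",
                    site, count.get(), reached.get(site).get()));
        }
    }

    class Account {
        private int balance;

        void deposit(int amount) {
            // An analyzer that ignores arithmetic overflow implicitly assumes this
            // addition cannot wrap around; the instrumented check records whether
            // that assumption holds for the concrete inputs seen at run time.
            Assumed.check("Account.deposit#no-overflow",
                (long) balance + amount == balance + amount);
            balance += amount;
        }
    }

In the study itself, the instrumented assumptions were exercised by concrete executions of the six open-source projects in order to measure how often Clousot’s unsound assumptions are violated in practice.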

Keywords

Test Suite, Access Path, Explicit Assumption, Object Invariant, Assumed Statement

Copyright information

© Springer-Verlag Berlin Heidelberg 2015

Authors and Affiliations

  • Maria Christakis¹
  • Peter Müller¹
  • Valentin Wüstholz¹
  1. Department of Computer Science, ETH Zurich, Switzerland
