Modular Demand-Driven Analysis of Semantic Difference for Program Versions

  • Conference paper
  • Published in: Static Analysis (SAS 2017)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 10422)

Included in the conference series: Static Analysis Symposium (SAS)

Abstract

In this work we present a modular and demand-driven analysis of the semantic difference between program versions. Our analysis characterizes initial states for which final states in the program versions differ. It also characterizes states for which the final states are identical. Such characterizations are useful for regression verification, for revealing security vulnerabilities and for identifying changes in the program’s functionality.

Syntactic changes in program versions are often small and local and may apply to procedures that are deep in the call graph. Our approach analyses only those parts of the programs that are affected by the changes. Moreover, the analysis is modular, processing a single pair of procedures at a time. Called procedures are not inlined. Rather, their previously computed summaries and difference summaries are used. For efficiency, procedure summaries and difference summaries can be abstracted and may be refined on demand.

We have compared our method to well-established tools and observed speedups of an order of magnitude or more. Furthermore, in many cases our tool proves equivalence or finds differences where the other tools fail to do so.
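To make the notion of a difference summary concrete, here is a minimal sketch (hypothetical, not the authors' tool or formalism), assuming the z3-solver Python package: two versions of a procedure are written down as symbolic input-to-output summaries, and a single solver query characterizes the initial states on which their final states differ.

    # Minimal sketch of a difference-summary query (hypothetical example;
    # assumes the z3-solver package, not the paper's implementation).
    from z3 import Int, If, Solver, sat

    x = Int('x')  # the procedure's single visible input

    # Symbolic input-to-output summaries of two versions of a procedure.
    out_v1 = If(x > 0, x + 1, 0)
    out_v2 = If(x > 1, x + 1, 0)   # the guard changed from x > 0 to x > 1

    s = Solver()
    s.add(out_v1 != out_v2)        # initial states whose final states differ

    if s.check() == sat:
        print("versions differ, e.g. for x =", s.model()[x])   # e.g. x = 1
    else:
        print("versions agree on all inputs")

In the paper's approach such summaries are not written by hand: they are computed modularly, one pair of procedures at a time, may be abstract, and are refined on demand.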

Supported by the ERC project 280053 (CPROVER), the H2020 FET OPEN 712689 SC\(^2\) and the Prof. A. Pazy Research Foundation.


Notes

  1.

    We assume that \(Y=\{y_1,\dots ,y_n\}\) and \(V^v_g=\{v_1,\dots ,v_n\}\), that \(y_i\) is assigned to \(v_i\) at the entry node, and that \(v_i\) is assigned back to \(y_i\) at the exit node.

  2.

    Current values of Y are assigned to the visible variables of g, and assigned back at termination of g.

  3.

    Since we assume that all inputs are given through visible variables, no hidden variable is used before it is initialized; hence \(V^h_p\) does not appear in \(R^{n+1}_\pi (V_p)\) or in \(T^{n+1}_\pi (V_p) \downarrow _{V^v_p}\).

  4.

    We use \(r(T^i_\pi [Y])\) to indicate that every \(v_k\in V^v_g\) is replaced by the expression \(T^i_\pi [y_k]\); a worked instance appears after these notes.

  5.

    We use \(\lnot \) for set complement with respect to the state space.

  6.

    An obvious optimization is to use the previous symbolic state for visible variables of p that are only used by g as inputs but are not changed in g. However, for simplicity of discussion we will not go into those details.
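As a hypothetical illustration of the substitution in Note 4 (the concrete expressions are invented for exposition, not taken from the paper): suppose the callee's visible variables are \(V^v_g=\{v_1,v_2\}\), its summary \(r\) contains the constraint \(v_1' = v_1 + v_2\), and the caller's symbolic state along path \(\pi\) maps the actuals \(Y=\{y_1,y_2\}\) to \(T^i_\pi [y_1]=x\) and \(T^i_\pi [y_2]=x+1\). Replacing each \(v_k\) by \(T^i_\pi [y_k]\) gives \(r(T^i_\pi [Y])\), which then contains \(v_1' = x + (x+1) = 2x+1\).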


Author information


Correspondence to Anna Trostanetski, Orna Grumberg or Daniel Kroening.



Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Trostanetski, A., Grumberg, O., Kroening, D. (2017). Modular Demand-Driven Analysis of Semantic Difference for Program Versions. In: Ranzato, F. (ed.) Static Analysis. SAS 2017. Lecture Notes in Computer Science, vol 10422. Springer, Cham. https://doi.org/10.1007/978-3-319-66706-5_20

  • DOI: https://doi.org/10.1007/978-3-319-66706-5_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-66705-8

  • Online ISBN: 978-3-319-66706-5

  • eBook Packages: Computer Science, Computer Science (R0)
