Frama-C, A Collaborative Framework for C Code Verification: Tutorial Synopsis

  • Nikolai Kosmatov
  • Julien Signoles
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10012)

Abstract

Frama-C is a source code analysis platform that aims at verifying industrial-size C programs. It provides its users with a collection of plug-ins that perform static and dynamic analyses of safety- and security-critical software. Collaborative verification across cooperating plug-ins is enabled by their integration on top of a shared kernel and by their compliance with a common specification language, ACSL.
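
As a hedged illustration of what an ACSL-specified C function looks like (the function, its name and its contract are invented for this synopsis, not taken from the tutorial material), consider:

    /* Counts the occurrences of 0 in a[0 .. n-1].
       The ACSL contract states what callers must guarantee (requires),
       what the function may modify (assigns) and what it returns
       (ensures); the loop annotations let WP discharge the proof. */
    /*@ requires n >= 0;
        requires \valid_read(a + (0 .. n-1));
        assigns \nothing;
        ensures 0 <= \result <= n;
    */
    int count_zeros(const int *a, int n) {
      int count = 0, i;
      /*@ loop invariant 0 <= i <= n;
          loop invariant 0 <= count <= i;
          loop assigns i, count;
          loop variant n - i;
      */
      for (i = 0; i < n; i++)
        if (a[i] == 0)
          count++;
      return count;
    }

Because the contract is written once in ACSL, the same annotations can be proved deductively, checked at runtime after instrumentation, or used as an oracle for generated tests.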

This paper presents a three-hour tutorial on Frama-C in which we provide a comprehensive overview of its most important plug-ins: the abstract-interpretation-based plug-in Value, the deductive verification tool WP, the runtime verification tool E-ACSL, and the test generation tool PathCrawler. We also emphasize possible collaborations between these plug-ins and a few others. The presentation is illustrated with concrete examples of C programs.
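
As a minimal sketch of such a collaboration (the function and its deliberate defect are ours, invented for illustration), a single ACSL assertion can serve several plug-ins at once:

    /* abs_val is a deliberately flawed absolute-value function:
       for x == INT_MIN, -x overflows, so the assertion below can fail.
       - Value reports the potential signed overflow and cannot
         establish the assertion for all inputs;
       - WP turns the assertion into a proof obligation, which fails
         without a precondition excluding INT_MIN;
       - E-ACSL translates the assertion into an inline runtime check;
       - PathCrawler generates test inputs covering both branches. */
    int abs_val(int x) {
      int r = (x < 0) ? -x : x;
      /*@ assert r >= 0; */
      return r;
    }

Adding a precondition such as /*@ requires x > INT_MIN; */ would make the assertion provable, illustrating how feedback from one analyzer refines the specification consumed by the others.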

Keywords

Frama-C · ACSL · Abstract interpretation · Deductive verification · Runtime verification · Test generation · Combinations of analyses

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. CEA, LIST, Software Reliability and Security Laboratory, Gif-sur-Yvette, France