Bakar Kiasan: Flexible Contract Checking for Critical Systems Using Symbolic Execution

  • Jason Belt
  • John Hatcliff
  • Robby
  • Patrice Chalin
  • David Hardin
  • Xianghua Deng
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6617)

Abstract

Spark, a subset of Ada for engineering safety and security-critical systems, is designed for verification and includes a software contract language for specifying functional properties of procedures. Even though Spark and its static analysis components are beneficial and easy to use, its contract language is almost never used due to the burdens the associated tool support imposes on developers. In this paper, we present: (a) SymExe techniques for checking software contracts in embedded critical systems, and (b) Bakar Kiasan, a tool that implements these techniques in an integrated development environment for Spark. We describe a methodology for using Bakar Kiasan that provides significant increases in automation, usability, and functionality over existing Spark tools, and we present results from experiments on its application to industrial examples.
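The contract-checking idea in the abstract can be illustrated with a toy sketch. This is a hypothetical Python example, not Bakar Kiasan itself (which analyzes Spark code with symbolic values backed by decision procedures); it shows the same shape of bounded, exhaustive exploration of a procedure's inputs against a pre/postcondition contract, in the spirit of Kiasan-style k-bounded checking. The names `clamp`, `pre`, `post`, and `check_contract` are invented for illustration.

```python
def clamp(x, lo, hi):
    """Procedure under analysis: confine x to the range [lo, hi]."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def pre(x, lo, hi):
    # Precondition (as a Spark-style contract would state): range is well-formed.
    return lo <= hi

def post(x, lo, hi, result):
    # Postcondition: result lies in range, and equals x unless x was out of range.
    return lo <= result <= hi and (result == x or x < lo or x > hi)

def check_contract(bound=4):
    """Exhaustively explore all inputs within the bound; return a
    counterexample tuple if the contract is violated, else None."""
    for x in range(-bound, bound + 1):
        for lo in range(-bound, bound + 1):
            for hi in range(-bound, bound + 1):
                if not pre(x, lo, hi):
                    continue  # inputs violating the precondition are out of scope
                result = clamp(x, lo, hi)
                if not post(x, lo, hi, result):
                    return (x, lo, hi)  # contract violation found
    return None

print(check_contract())  # prints None: no counterexample within the bound
```

A real symbolic executor replaces the nested loops with symbolic inputs and path conditions discharged by an SMT solver, so each feasible program path is checked once rather than once per concrete input.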


References

  1. Barnes, J.: High Integrity Software – the SPARK Approach to Safety and Security. Addison-Wesley, Reading (2003)
  2. Barrett, C., Tinelli, C.: CVC3. In: Damm, W., Hermanns, H. (eds.) CAV 2007. LNCS, vol. 4590, pp. 298–302. Springer, Heidelberg (2007)
  3. Belt, J., Hatcliff, J., Robby, Chalin, P., Hardin, D., Deng, X.: Bakar Kiasan: Flexible contract checking for critical systems using symbolic execution. Technical Report SAnToS-TR2011-01-03, Kansas State University (2011), http://people.cis.ksu.edu/~belt/SAnToS-TR2011-01-03.pdf
  4. Belt, J., Robby, Deng, X.: Sireum/Topi LDP: A lightweight semi-decision procedure for optimizing symbolic execution-based analyses. In: Proceedings of the ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE), pp. 355–364 (2009)
  5. Cadar, C., Dunbar, D., Engler, D.R.: KLEE: Unassisted and automatic generation of high-coverage tests for complex systems programs. In: 8th USENIX Symposium on Operating Systems Design and Implementation (OSDI), pp. 209–224. USENIX Association (2008)
  6. Chang, B.-Y.E., Leino, K.R.M.: Inferring object invariants: Extended abstract. Electr. Notes Theor. Comput. Sci. 131, 63–74 (2005)
  7. de Moura, L.M., Bjørner, N.: Z3: An efficient SMT solver. In: Ramakrishnan, C.R., Rehof, J. (eds.) TACAS 2008. LNCS, vol. 4963, pp. 337–340. Springer, Heidelberg (2008)
  8. Deng, X., Lee, J., Robby: Efficient symbolic execution algorithms for programs manipulating dynamic heap objects. Technical Report SAnToS-TR2009-09-25, Kansas State University (September 2009)
  9. Deng, X., Walker, R., Robby: Program behavioral benchmarks for evaluating path-sensitive bounded verification techniques. Technical Report SAnToS-TR2010-08-20, Kansas State University (2010)
  10. Dutertre, B., de Moura, L.: The Yices SMT solver (August 2006), tool paper at http://yices.csl.sri.com/tool-paper.pdf
  11. Godefroid, P., Klarlund, N., Sen, K.: DART: Directed automated random testing. In: ACM SIGPLAN 2005 Conference on Programming Language Design and Implementation (PLDI), pp. 213–223. ACM Press, New York (2005)
  12. Grieskamp, W., Tillmann, N., Schulte, W.: XRT – exploring runtime for .NET: Architecture and applications. In: Workshop on Software Model Checking (2005)
  13. Hantler, S.L., King, J.C.: An introduction to proving the correctness of programs. ACM Computing Surveys 8(3), 331–353 (1976)
  14. Jackson, D.: Alloy: A lightweight object modelling notation. ACM Transactions on Software Engineering and Methodology 11(2), 256–290 (2002)
  15. Jackson, P., Passmore, G.: Proving SPARK verification conditions with SMT solvers (2009), draft journal article, http://homepages.inf.ed.ac.uk/pbj/papers/vct-dec09-draft.pdf
  16. Khurshid, S., Păsăreanu, C.S., Visser, W.: Generalized symbolic execution for model checking and testing. In: Garavel, H., Hatcliff, J. (eds.) TACAS 2003. LNCS, vol. 2619, pp. 553–568. Springer, Heidelberg (2003)
  17. King, J.C.: Symbolic execution and program testing. Communications of the ACM 19(7), 385–394 (1976)
  18. Kroening, D., Strichman, O.: Decision Procedures – An Algorithmic Point of View. Springer, Heidelberg (2008)
  19. Leino, K.R.M., Logozzo, F.: Loop invariants on demand. In: Yi, K. (ed.) APLAS 2005. LNCS, vol. 3780, pp. 119–134. Springer, Heidelberg (2005)
  20. Rossebo, B., Oman, P., Alves-Foss, J., Blue, R., Jaszkowiak, P.: Using SPARK-Ada to model and verify a MILS message router. In: Proceedings of the International Symposium on Secure Software Engineering (2006)
  21. Rushby, J.: The design and verification of secure systems. In: 8th ACM Symposium on Operating Systems Principles, vol. 15(5), pp. 12–21 (1981)
  22. Sen, K., Agha, G.: CUTE: A concolic unit testing engine for C. In: ACM SIGSOFT Symposium on the Foundations of Software Engineering (FSE), pp. 263–272 (2005)
  23. Tillmann, N., de Halleux, J.: Pex – white box test generation for .NET. In: Beckert, B., Hähnle, R. (eds.) TAP 2008. LNCS, vol. 4966, pp. 134–153. Springer, Heidelberg (2008)
  24. Frama-C website, http://frama-c.com/

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Jason Belt (1)
  • John Hatcliff (1)
  • Robby (1)
  • Patrice Chalin (2)
  • David Hardin (3)
  • Xianghua Deng (4)

  1. Kansas State University, USA
  2. Concordia University, Canada
  3. Rockwell Collins Advanced Technology Center, USA
  4. Penn State University Harrisburg, USA
