autoC: an efficient translator for model checking deterministic scheduler based OSEK/VDX applications

  • Haitao Zhang
  • Zhuo Cheng
  • Guoqiang Li
  • Shaoying Liu
Research Paper


The OSEK/VDX automotive OS standard has been widely adopted by many automobile manufacturers, such as BMW and TOYOTA, as the basis for designing and implementing vehicle-mounted operating systems. With the increasing functionality in vehicles, more and more multi-task applications are developed on top of the OSEK/VDX OS, and ensuring the reliability of these applications has become a challenge for developers. To ensure the reliability of OSEK/VDX applications, model checking has attracted great attention in the automotive industry as a potential solution. However, existing model checkers are often unable to verify a large-scale OSEK/VDX application that consists of many tasks, since the corresponding application model is too complex. To make existing model checkers more scalable in verifying large-scale OSEK/VDX applications, we describe a software tool named autoC that tackles this problem by automatically translating a multi-task OSEK/VDX application into an equivalent sequential model. We conducted a series of experiments to evaluate the efficiency of autoC. The results show that autoC is not only capable of efficiently sequentializing OSEK/VDX applications, but also of improving the scalability and efficiency of existing model checkers in verifying large-scale OSEK/VDX applications.


Keywords: OSEK/VDX application, deterministic scheduler, software formal method, model checking, sequentialization



This work was supported by National Natural Science Foundation of China (Grant Nos. 61602224, 61472240) and Fundamental Research Funds for the Central Universities (Grant Nos. lzujbky-2016-142, lzujbky-2016-k07).



Copyright information

© Science China Press and Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  • Haitao Zhang (1, corresponding author)
  • Zhuo Cheng (2)
  • Guoqiang Li (3)
  • Shaoying Liu (4)
  1. School of Information Science and Engineering, Lanzhou University, Lanzhou, China
  2. School of Information Science, Japan Advanced Institute of Science and Technology, Nomi, Japan
  3. School of Software, Shanghai Jiao Tong University, Shanghai, China
  4. Faculty of Computer and Information Sciences, Hosei University, Tokyo, Japan
