Automated test generation using model checking: an industrial evaluation

  • Eduard P. Enoiu
  • Adnan Čaušević
  • Thomas J. Ostrand
  • Elaine J. Weyuker
  • Daniel Sundmark
  • Paul Pettersson
ICTSS 2013


In software development, testers often focus on functional testing to validate implemented programs against their specifications. In safety-critical software development, testers must also show that tests exercise, or cover, the structure and logic of the implementation. To achieve different types of logic coverage, program artifacts such as decisions and conditions must be exercised during testing. The use of model checking for structural test generation has been proposed by several researchers. However, limited applicability to models used in practice, together with state-space explosion, can hinder model checking and hence the derivation of tests for logic coverage. There is therefore a need to validate these approaches on relevant industrial systems, so that more knowledge is built on how to use them efficiently in practice. In this paper, we present a tool-supported approach that handles software written in the Function Block Diagram language, such that logic coverage criteria can be formalized and used by a model checker to automatically generate tests. To this end, we conducted a study based on industrial use-case scenarios from Bombardier Transportation AB, showing how our toolbox CompleteTest can be applied to generate tests for software systems used in the safety-critical domain. To evaluate the approach, we applied the toolbox to 157 programs and found that it is efficient in terms of the time required to generate tests satisfying logic coverage, and that it scales well for most of the programs.
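The core idea the abstract describes can be illustrated with a minimal sketch: each logic-coverage obligation (e.g., for decision coverage, the decision evaluating to true and to false) is treated as a reachability query, and a witness assignment becomes a test case. The sketch below is a hypothetical, brute-force stand-in for the symbolic search a model checker such as UPPAAL performs; the function name and the example decision are illustrative, not part of the CompleteTest tool.

```python
from itertools import product

def decision_coverage_tests(decision, conditions):
    """Search for input assignments that make the decision evaluate
    to True and to False (decision coverage). A model checker does
    this symbolically via reachability queries; here we simply
    enumerate, which is only feasible for few conditions."""
    tests = {}
    for values in product([False, True], repeat=len(conditions)):
        env = dict(zip(conditions, values))
        outcome = decision(env)
        if outcome not in tests:
            tests[outcome] = env          # witness = one test case
        if len(tests) == 2:               # both obligations covered
            break
    return tests

# Example: a decision from a hypothetical FBD program, d = (a AND b) OR c.
tests = decision_coverage_tests(lambda e: (e["a"] and e["b"]) or e["c"],
                                ["a", "b", "c"])
```

Stronger criteria such as MC/DC add more obligations per decision (each condition shown to independently affect the outcome), but the pattern is the same: one reachability query per obligation, one test per witness trace.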


Automated test generation · Software testing · Model checking · UPPAAL · Logic coverage · Safety-critical systems · IEC 1131-3 · FBD · Function Block Diagram · Structured Text · PLC · Programmable Logic Controllers · Model-based testing



This research was supported by VINNOVA, the Swedish Governmental Agency for Innovation Systems, within the ATAC project, and by The Knowledge Foundation (KKS) through project 20130085, Testing of Critical System Characteristics (TOCSYC).



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Eduard P. Enoiu (1)
  • Adnan Čaušević (1)
  • Thomas J. Ostrand (3)
  • Elaine J. Weyuker (1)
  • Daniel Sundmark (1, 2)
  • Paul Pettersson (1)

  1. Mälardalen University, Västerås, Sweden
  2. Swedish Institute of Computer Science, Stockholm, Sweden
  3. Software Engineering Research Consultant, Västerås, Sweden
