Environment-Model Based Testing of Control Systems: Case Studies

  • Erwan Jahier
  • Simplice Djoko-Djoko
  • Chaouki Maiza
  • Eric Lafont
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8413)


A reactive system reacts to an environment it tries to control. Lurette is a black-box testing tool for such closed-loop systems. It focuses on environment modeling using Lutin, a language designed to perform guided random exploration of the environment of the System Under Test (SUT), taking feedback into account. The test decision is automated using Lustre oracles obtained by formalising the functional requirements.

In this article, we report on experiments conducted with Lurette on two industrial case studies. The first deals with a dynamic system that simulates the behavior of the temperature and pressure of a fluid in a pipe. The second shows how Lurette can be used to automate the processing of an existing test booklet for a Supervisory Control and Data Acquisition (SCADA) library module.
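The closed-loop testing scheme described above can be sketched in a few lines. This is an illustrative toy model, not Lurette's actual API: the environment model (playing the role of a Lutin program) draws constrained random inputs while reacting to the SUT's output, the SUT (here a hypothetical saturated integrator) computes its next output, and an oracle (playing the role of a Lustre observer) checks a safety property at every step.

```python
import random

def environment(prev_output):
    """Environment model: guided random input, reacting to SUT feedback."""
    # Push the system upward while its output is low, back off when it is high.
    if prev_output < 5.0:
        return random.uniform(0.0, 1.0)
    return random.uniform(-1.0, 0.0)

def sut(state, inp):
    """Toy system under test: an integrator saturated to [0, 10]."""
    state = max(0.0, min(10.0, state + inp))
    return state, state  # (next state, output)

def oracle(output):
    """Test oracle: the safety property every output must satisfy."""
    return 0.0 <= output <= 10.0

def run_test(steps=100, seed=0):
    """One closed-loop test run; returns the pass/fail verdict."""
    random.seed(seed)
    state, output = 0.0, 0.0
    for _ in range(steps):
        inp = environment(output)       # environment reacts to the feedback
        state, output = sut(state, inp)
        if not oracle(output):
            return False                # verdict: fail
    return True                         # verdict: pass

print(run_test())
```

In the real tool chain, `environment` would be a Lutin program whose constraints are solved at each step, and `oracle` a Lustre node compiled alongside the SUT; the sketch only shows how the three pieces are wired in a closed loop.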


Keywords: Reactive systems · Control-command · Dynamic systems · SCADA · Test booklets · Black-box testing · Requirements engineering · Synchronous languages



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Erwan Jahier (1)
  • Simplice Djoko-Djoko (1)
  • Chaouki Maiza (1)
  • Eric Lafont (2)
  1. VERIMAG-CNRS, Grenoble, France
  2. ATOS-WORLDGRID, Grenoble, France
