Automated Software Engineering

Volume 24, Issue 2, pp. 369–391

Test oracles for Simulink-like models


Abstract

The design of embedded systems is often supported by the definition of executable models for tools like Matlab/Simulink or Scilab/Xcos. These models play a pivotal role in the development process, and their correctness is thus extremely important. Many solutions exist for defining suitable tests to “exercise” these models, but only a few (partial) solutions exist for assessing the quality of execution (simulation) results, that is, for defining suitable oracles. This paper addresses the problem and proposes a formal language for specifying oracles and relating them to existing models. It also presents Apolom, a prototype tool for checking simulation results against stated oracles. The empirical evaluation we conducted to assess the viability of the proposed solution is organized around four case studies and shows promising results in terms of effectiveness, efficiency, and required resources.
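To make the core idea concrete, the following is a minimal, hypothetical sketch (in Python, not the paper's own oracle language or the Apolom tool) of what a test oracle for a simulation does: it takes a recorded simulation trace and a property, and reports the time steps at which the property is violated. The example trace and the saturation property are invented for illustration.

```python
# Hypothetical sketch of an oracle check over a simulation trace.
# A trace is a sequence of (input, output) samples, one per time step;
# a property is a predicate that each sample must satisfy.

def check_oracle(trace, prop):
    """Return the list of time steps at which `prop` is violated."""
    return [t for t, sample in enumerate(trace) if not prop(sample)]

# Invented trace from a simulated saturation block with limit 1.0
trace = [(0.0, 0.0), (0.5, 0.5), (1.5, 1.0), (2.0, 1.0)]

# Oracle property: the output must never exceed the saturation limit
violations = check_oracle(trace, lambda s: s[1] <= 1.0)
print(violations)  # an empty list means the oracle passed
```

In practice, a real oracle language for Simulink-like models would also need to express temporal relations between signals (e.g., bounded-response properties), not just per-sample invariants as in this sketch.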

Keywords

Embedded software · Matlab/Simulink · Test oracle


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Politecnico di Milano, Dipartimento di Elettronica, Informazione e Bioingegneria, Milan, Italy
  2. Universidade de São Paulo, ICMC, São Carlos, Brazil
  3. Universidade Tecnológica Federal do Paraná, DACOM, Cornélio Procópio, Brazil
