Require, Test and Trace IT

  • Bernhard K. Aichernig
  • Klaus Hörmaier
  • Florian Lorber
  • Dejan Ničković
  • Stefan Tiran
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9128)

Abstract

We propose a framework for requirement-driven test generation that combines contract-based interface theories with model-based testing. We design a specification language, requirement interfaces, for formalizing different views (aspects) of synchronous data-flow systems from informal requirements. Multiple views of a system, modeled as requirement interfaces, are naturally combined by conjunction.
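To make the central notion concrete, here is a minimal toy sketch (our own illustration, not the paper's formalism; all names such as RequirementInterface, safety and function are invented) that models a requirement interface as a predicate over one synchronous step and combines two views of the same system by conjunction.

```python
# Illustrative sketch only, assuming a toy formalization: a requirement
# interface is a predicate over the input/output valuations of one
# synchronous step, and two views are combined by logical conjunction.

from dataclasses import dataclass
from typing import Callable, Dict

Valuation = Dict[str, int]                               # one step: variable -> value
StepPredicate = Callable[[Valuation, Valuation], bool]   # (inputs, outputs) -> allowed?

@dataclass
class RequirementInterface:
    name: str
    step: StepPredicate   # holds iff the step satisfies this view

    def conjoin(self, other: "RequirementInterface") -> "RequirementInterface":
        # Conjunction of views: a step is allowed iff both views allow it.
        return RequirementInterface(
            name=f"({self.name} & {other.name})",
            step=lambda i, o: self.step(i, o) and other.step(i, o),
        )

# Two hypothetical views of a controller (the rules are invented for illustration):
safety = RequirementInterface(
    "safety", lambda i, o: not (i["overheat"] == 1 and o["power"] == 1))
function = RequirementInterface(
    "function", lambda i, o: o["power"] == i["request"])

combined = safety.conjoin(function)
print(combined.step({"overheat": 0, "request": 1}, {"power": 1}))  # True
print(combined.step({"overheat": 1, "request": 1}, {"power": 1}))  # False
```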

We develop an incremental test generation procedure with several advantages. Test generation is driven by a single requirement interface at a time, so each test assesses a specific aspect or feature of the system, specified by its associated requirement interface. Since we do not explicitly compute the conjunction of all requirement interfaces of the system, we avoid state-space explosion during test generation. Nevertheless, we incrementally complete a test for a specific feature with the constraints defined by the other requirement interfaces. As a result, test execution can catch violations of any requirement, not only of the one used to generate the test. Finally, the framework defines a natural association between informal requirements, their formal specifications, and the generated tests, thus facilitating traceability. We implemented a prototype test generation tool and demonstrate its applicability on an industrial use case.
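The sketch below continues the toy example above under the same assumptions (it reuses RequirementInterface, safety and function; the brute-force enumeration is invented for illustration and is not the paper's algorithm). It shows the incremental idea: tests are derived from one driving interface, while all interfaces act as monitors during execution, so a violation of any requirement is detected and can be traced back to the view it belongs to.

```python
# Continuation of the toy sketch above (RequirementInterface, safety, function).
# The search strategy here is an assumption for illustration, not the paper's
# test-generation procedure.

def generate_tests(driving, input_domain, output_domain):
    """For each input valuation, pick one expected output allowed by the driving view."""
    tests = []
    for i_vals in input_domain:
        for o_vals in output_domain:
            if driving.step(i_vals, o_vals):
                tests.append((i_vals, o_vals))
                break
    return tests

def execute_and_check(sut, inputs, all_interfaces):
    """Run one step on the SUT; report the first requirement it violates, if any."""
    outputs = sut(inputs)
    for ri in all_interfaces:
        if not ri.step(inputs, outputs):
            return f"FAIL ({ri.name})"
    return "PASS"

# Hypothetical faulty SUT: it honors requests but ignores the overheat input.
faulty_sut = lambda i: {"power": i["request"]}

input_domain = [{"overheat": a, "request": b} for a in (0, 1) for b in (0, 1)]
output_domain = [{"power": p} for p in (0, 1)]

for inputs, expected in generate_tests(function, input_domain, output_domain):
    print(inputs, "expected", expected, execute_and_check(faulty_sut, inputs, [safety, function]))
# The test derived for overheat=1, request=1 fails against 'safety', even though
# it was generated from the 'function' view alone.
```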

Keywords

Model-based testing · Test-case generation · Requirements engineering · Traceability · Requirement interfaces · Formal specification · Synchronous systems · Consistency checking · Incremental test-case generation

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Bernhard K. Aichernig (1)
  • Klaus Hörmaier (2)
  • Florian Lorber (1)
  • Dejan Ničković (3)
  • Stefan Tiran (1, 3)

  1. Graz University of Technology, Graz, Austria
  2. Infineon Technologies Austria AG, Villach, Austria
  3. AIT Austrian Institute of Technology, Vienna, Austria
