Component-aware Input-Output Conformance

  • Alexander Graf-Brill
  • Holger Hermanns
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11535)

Abstract

Black-box conformance testing based on a compositional model of the intended behaviour is a very attractive approach to validating the correctness of an implementation. In this context, input-output conformance is a scientifically well-established formalisation of the testing process. This paper discusses the peculiar problems that arise when the implementation is a monolithic black box, for instance for reasons of intellectual property restrictions, while the specification is compositional. In essence, tests must be able to observe progress in individual specification-level components. To this end, we reconsider input-output conformance so that it can faithfully deal with such situations. Refined notions of quiescence play a central role in a proper treatment of the problem. We focus on the scenario of parallel components with fully asynchronous communication, which covers a great many notable practical examples. We finally illustrate the practical implications of component-aware conformance testing in the context of a prominent example, namely networked embedded software.
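
For orientation (standard background from Tretmans, not the paper's new definition): the component-aware notion developed here refines the classical ioco relation, under which an implementation i conforms to a specification s according to

\[
  i \mathrel{\mathbf{ioco}} s \;\Longleftrightarrow\; \forall\, \sigma \in \mathit{Straces}(s):\ \mathit{out}(i \text{ after } \sigma) \subseteq \mathit{out}(s \text{ after } \sigma),
\]

where Straces(s) denotes the suspension traces of s, i.e. traces over inputs, outputs, and the quiescence label δ, and out(p after σ) collects the outputs, including quiescence, that p can exhibit after σ. The refined notions of quiescence mentioned above concern how δ is observed when the specification consists of parallel, asynchronously communicating components.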

Keywords

Model-based testing · Input-output conformance · Compositionality

Acknowledgements

This work received financial support from the ERC Advanced Investigators Grant 695614 (POWVER) and from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) grant 389792660 as part of TRR 248, see https://perspicuous-computing.science.

Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. Saarland University, Saarbrücken, Germany
  2. Institute of Intelligent Software, Guangzhou, China
