Input-Output Conformance Simulation (iocos) for Model Based Testing

  • Carlos Gregorio-Rodríguez
  • Luis Llana
  • Rafael Martínez-Torres
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7892)


We present a new model-based testing theory built on simulation semantics. At its core lies an input-output conformance simulation relation (iocos). As a branching-time semantics, iocos naturally distinguishes the contexts of local choices. We show that iocos is a finer relation than the classic ioco conformance relation. Moreover, iocos is transitive, so it can serve both as a conformance relation and as a refinement preorder. We also provide an alternative characterisation of iocos in terms of testing semantics. Finally, we present an algorithm that produces a test suite for any specification; the resulting test suite is sound and exhaustive for that specification with respect to iocos.
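The simulation side of the abstract can be illustrated with a toy fixpoint computation of the largest simulation relation between two finite labelled transition systems. This is only a sketch of the generic branching-semantics idea: it ignores the input/output distinction and quiescence that the actual iocos relation handles, and the LTS encoding and all names are hypothetical.

```python
# Toy computation of the largest simulation relation between two finite
# labelled transition systems, each encoded as: state -> {action: {successors}}.
# NOTE: this is a generic simulation check for illustration only, NOT the
# paper's iocos definition (which separates inputs from outputs).

def largest_simulation(impl, spec):
    # Start from the full relation and prune pairs violating the simulation
    # transfer condition until a fixpoint is reached. A pair (p, q) in the
    # result means "spec state q simulates impl state p".
    rel = {(p, q) for p in impl for q in spec}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(rel):
            # Every transition p --a--> p2 must be matched by some
            # transition q --a--> q2 with (p2, q2) still in the relation.
            ok = all(
                any((p2, q2) in rel for q2 in spec[q].get(a, ()))
                for a, succs in impl[p].items()
                for p2 in succs
            )
            if not ok:
                rel.discard((p, q))
                changed = True
    return rel

# A small coffee-machine example: the implementation resolves the
# specification's local choice by offering only coffee.
impl = {"i0": {"coin": {"i1"}}, "i1": {"coffee": {"i0"}}}
spec = {"s0": {"coin": {"s1"}}, "s1": {"coffee": {"s0"}, "tea": {"s0"}}}
print(("i0", "s0") in largest_simulation(impl, spec))  # → True
```

Because the relation is computed coinductively rather than over traces, it distinguishes where a choice is resolved, which is the branching-time sensitivity the abstract attributes to iocos.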


Keywords: Model Based Testing · Input Output Conformance Simulation · Formal Methods


References

  1. Abramsky, S.: Observational equivalence as a testing equivalence. Theoretical Computer Science 53(3), 225–241 (1987)
  2. de Alfaro, L.: Game models for open systems. In: Dershowitz, N. (ed.) Verification: Theory and Practice. LNCS, vol. 2772, pp. 269–289. Springer, Heidelberg (2004)
  3. Alur, R., Henzinger, T.A., Kupferman, O., Vardi, M.Y.: Alternating refinement relations. In: Sangiorgi, D., de Simone, R. (eds.) CONCUR 1998. LNCS, vol. 1466, pp. 163–178. Springer, Heidelberg (1998)
  4. Belinfante, A.: JTorX: A tool for on-line model-driven test derivation and execution. In: Esparza, J., Majumdar, R. (eds.) TACAS 2010. LNCS, vol. 6015, pp. 266–270. Springer, Heidelberg (2010)
  5. Chow, T.S.: Testing software design modeled by finite-state machines. IEEE Trans. Software Eng. 4(3), 178–187 (1978)
  6. Feijs, L.M.G., Goga, N., Mauw, S., Tretmans, J.: Test selection, trace distance and heuristics. In: Schieferdecker, I., König, H., Wolisz, A. (eds.) TestCom. IFIP Conference Proceedings, vol. 210, pp. 267–282. Kluwer (2002)
  7. de Frutos-Escrig, D., Gregorio-Rodríguez, C., Palomino, M.: On the unification of process semantics: Equational semantics. Electronic Notes in Theoretical Computer Science 249, 243–267 (2009)
  8. de Frutos-Escrig, D., Gregorio-Rodríguez, C.: (Bi)simulations up-to characterise process semantics. Information and Computation 207(2), 146–170 (2009)
  9. van Glabbeek, R.J.: The linear time – branching time spectrum I. In: Handbook of Process Algebra, pp. 3–99. Elsevier (2001)
  10. van Glabbeek, R.J., Ploeger, B.: Correcting a space-efficient simulation algorithm. In: Gupta, A., Malik, S. (eds.) CAV 2008. LNCS, vol. 5123, pp. 517–529. Springer, Heidelberg (2008)
  11. Hennessy, M.: Algebraic Theory of Processes. MIT Press (1988)
  12. Hierons, R.M., Bowen, J.P., Harman, M. (eds.): Formal Methods and Testing: An Outcome of the FORTEST Network, Revised Selected Papers. LNCS, vol. 4949. Springer, Heidelberg (2008)
  13. Milner, R.: An algebraic definition of simulation between programs. In: Proceedings of the 2nd International Joint Conference on Artificial Intelligence, pp. 481–489. BCS (1971)
  14. Milner, R.: Communication and Concurrency. Prentice Hall (1989)
  15. De Nicola, R., Hennessy, M.: Testing equivalences for processes. Theoretical Computer Science 34(1-2), 83–133 (1984)
  16. Ranzato, F., Tapparo, F.: A new efficient simulation equivalence algorithm. In: LICS, pp. 171–180. IEEE Computer Society (2007)
  17. Romero Hernández, D., de Frutos Escrig, D.: Defining distances for all process semantics. In: Giese, H., Rosu, G. (eds.) FORTE 2012 and FMOODS 2012. LNCS, vol. 7273, pp. 169–185. Springer, Heidelberg (2012)
  18. Stokkink, G., Timmer, M., Stoelinga, M.: Talking quiescence: a rigorous theory that supports parallel composition, action hiding and determinisation. In: Petrenko, A.K., Schlingloff, H. (eds.) MBT. EPTCS, vol. 80, pp. 73–87 (2012)
  19. Tretmans, J.: Test generation with inputs, outputs and repetitive quiescence. Software – Concepts and Tools 17(3), 103–120 (1996)
  20. Tretmans, J.: Model based testing with labelled transition systems. In: Hierons et al. (eds.) [12], pp. 1–38
  21. Tretmans, J., Brinksma, E.: TorX: Automated model-based testing. In: Hartman, A., Dussa-Ziegler, K. (eds.) First European Conference on Model-Driven Software Engineering, pp. 31–43 (December 2003)
  22. Utting, M., Pretschner, A., Legeard, B.: A taxonomy of model-based testing approaches. Softw. Test., Verif. Reliab. 22(5), 297–312 (2012)
  23. Veanes, M., Bjørner, N.: Alternating simulation and ioco. In: Petrenko, A., Simão, A., Maldonado, J.C. (eds.) ICTSS 2010. LNCS, vol. 6435, pp. 47–62. Springer, Heidelberg (2010)
  24. Veanes, M., Campbell, C., Grieskamp, W., Schulte, W., Tillmann, N., Nachmanson, L.: Model-based testing of object-oriented reactive systems with Spec Explorer. In: Hierons et al. (eds.) [12], pp. 39–76

Copyright information

© IFIP International Federation for Information Processing 2013

Authors and Affiliations

  • Carlos Gregorio-Rodríguez (1)
  • Luis Llana (1)
  • Rafael Martínez-Torres (1)
  1. Departamento de Sistemas Informáticos y Computación, Universidad Complutense de Madrid, Spain
