
LaTe, a Non-fully Deterministic Testing Language

  • Emmanuel Donin de Rosière
  • Claude Jard
  • Benoît Parreaux
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3964)

Abstract

This paper presents a case study: testing a voice-based service. To develop this application, we propose new functionality for testing languages and a new language, called LaTe, that implements it.

With LaTe, a single test scenario can describe several different executions, and the interpreter tries to find the execution that best fits the real behavior of the System Under Test (SUT).

We give an operational semantics for these non-deterministic operators. Experimental results from testing the voice-based service are also included.
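The core idea of the abstract can be illustrated with a small sketch. This is not LaTe's actual syntax or semantics (which the paper defines); it is a hypothetical Python model in which a scenario step is either a single expected event or a set of alternatives, and the interpreter searches for an execution path that matches the SUT's observed trace.

```python
# Toy model of a non-deterministic test scenario (illustration only,
# not LaTe syntax): a scenario is a list of steps; each step is either
# one expected event (str) or a set of alternative events (set of str).
# The interpreter returns an execution path matching the SUT trace,
# or None if no alternative fits.

def explore(scenario, sut_trace):
    def step(i, j, path):
        if i == len(scenario):          # whole scenario matched
            return path
        if j == len(sut_trace):         # SUT trace ended too early
            return None
        expected = scenario[i]
        candidates = expected if isinstance(expected, set) else {expected}
        if sut_trace[j] in candidates:  # this alternative fits; continue
            return step(i + 1, j + 1, path + [sut_trace[j]])
        return None
    return step(0, 0, [])

# One scenario describes several executions: the service may answer
# with either a beep or a spoken prompt (hypothetical events).
scenario = ["call", {"beep", "prompt"}, "hangup"]
print(explore(scenario, ["call", "prompt", "hangup"]))   # matching path found
print(explore(scenario, ["call", "silence", "hangup"]))  # no execution fits: None
```

In the real language, the interpreter would additionally resolve such choices against timing and message content, but the principle is the same: one scenario, several admissible executions.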

Keywords

Speech Recognition · Test Scenario · System Under Test · Speech Recognition System · Testing Language


Copyright information

© IFIP International Federation for Information Processing 2006

Authors and Affiliations

  • Emmanuel Donin de Rosière (1)
  • Claude Jard (2)
  • Benoît Parreaux (1)
  1. France Télécom R&D, Lannion, France
  2. ENS Cachan, Bruz, France
