
Combining usage-based and model-based testing for service-oriented architectures in the industrial practice

Abstract

Usage-based testing focuses quality assurance on the most heavily used parts of the software. Its foundation are usage profiles, from which test cases are generated. There are two fundamental approaches to deriving usage profiles in usage-based testing: either the system under test (SUT) is observed during operation and a usage profile is automatically inferred from the collected usage data, or a usage profile is modeled by hand as part of a model-based testing (MBT) approach. In this article, we propose a third, combined approach: we automatically infer a usage profile and create a test data repository from usage data, and then create representations of the generated tests and test data in the test model of an MBT approach. The test model enables us to generate executable Testing and Test Control Notation version 3 (TTCN-3) code and thereby automate the test execution. Together with industrial partners, we applied this approach in two pilot studies. Our findings show that usage-based testing is applicable in practice and greatly helps with the automation of tests. Moreover, we found that even when usage-based testing itself is not of interest, incorporating usage data can ease the application of MBT.
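The inference of a usage profile from recorded usage data, as summarized above, is commonly realized as a Markov chain whose transition probabilities are estimated from observed event sequences. The following sketch illustrates this idea; it is not the article's implementation, and the session data, event names, and function names are hypothetical.

```python
from collections import defaultdict
import random

def infer_usage_profile(sessions):
    """Count first-order transitions between observed events and
    normalize them into a Markov-chain usage profile: a dict mapping
    each event to a list of (successor, probability) pairs."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        events = ["<start>"] + session + ["<end>"]
        for cur, nxt in zip(events, events[1:]):
            counts[cur][nxt] += 1
    profile = {}
    for cur, successors in counts.items():
        total = sum(successors.values())
        profile[cur] = [(e, n / total) for e, n in successors.items()]
    return profile

def generate_test_case(profile, rng=random):
    """Random walk through the profile from <start> to <end>,
    yielding one test sequence that follows the usage distribution."""
    event, case = "<start>", []
    while event != "<end>":
        successors = profile[event]
        event = rng.choices([e for e, _ in successors],
                            weights=[p for _, p in successors])[0]
        if event != "<end>":
            case.append(event)
    return case

# Hypothetical example: three recorded sessions of service operations.
sessions = [
    ["loginRequest", "queryRequest", "logoutRequest"],
    ["loginRequest", "queryRequest", "queryRequest", "logoutRequest"],
    ["loginRequest", "logoutRequest"],
]
profile = infer_usage_profile(sessions)
print(generate_test_case(profile))
```

Because highly frequent transitions get high probabilities, sampling from such a profile naturally concentrates generated test cases on the heavily used paths, which is the core idea of usage-based testing.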



Notes

  1. E.g., JAX-WS (https://jax-ws.java.net/) creates an input action and an output action for each operation. The input action gets the suffix Request and the output action the suffix Response.

  2. Please note that this does not have to be the case in synchronous communication if a service internally sends a request to another service before responding to the client.


Acknowledgments

This work was done in the context of the “Model and Inference Driven-Automated testing of Services architectures” (MIDAS) European project (project number 318786). We would like to thank Testing Technologies for their support in terms of licensing as well as feedback to support requests regarding TTworkbench; Fraunhofer FOKUS for the creation and maintenance of the MIDAS DSL and TTCN-3 generation; and our pilot partners from ITAINNOVA and Dedalus S.p.A. for their support in conducting the pilot studies.

Author information


Corresponding author

Correspondence to Steffen Herbold.


About this article


Cite this article

Herbold, S., Harms, P. & Grabowski, J. Combining usage-based and model-based testing for service-oriented architectures in the industrial practice. Int J Softw Tools Technol Transfer 19, 309–324 (2017). https://doi.org/10.1007/s10009-016-0437-y


Keywords

  • Usage-based testing
  • Model-based testing
  • Usage monitoring
  • Web service testing
  • TTCN-3