
Automation and intelligent scheduling of distributed system functional testing

Model-based functional testing in practice

International Journal on Software Tools for Technology Transfer

Abstract

This paper presents the approach to the functional test automation of services (black-box testing) and of service architectures (grey-box testing) that has been developed within the MIDAS project and is accessible on the MIDAS SaaS platform. In particular, it illustrates the algorithms and techniques adopted to address the input generation, oracle generation, dynamic scheduling, and session planning issues that underpin service functional test automation. More specifically, the paper details: (i) test input generation based on formal methods and temporal logic specifications; (ii) test oracle generation based on formal service specifications; (iii) dynamic scheduling of test cases based on probabilistic graphical reasoning; and (iv) reactive, evidence-based planning of test sessions with on-the-fly generation of new test cases. Finally, the use of the MIDAS prototype for the functional testing of operational services and service architectures in the healthcare industry is reported and assessed. A planned evolution of the technology addresses the testing and troubleshooting of distributed systems that integrate connected objects.
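As an illustration of technique (iii), the following is a minimal, schematic sketch of evidence-based test scheduling; it is not the MIDAS inference procedure, and all symbols are assumptions introduced here. The scheduler maintains a belief over candidate fault hypotheses $f$, updates it with the verdict $e_k$ of each executed test by Bayes' rule, and schedules next the test case expected to reduce the remaining uncertainty the most:

$$P(f \mid e_{1:k}) \;\propto\; P(e_k \mid f)\, P(f \mid e_{1:k-1}),$$

$$t^{*} \;=\; \arg\max_{t}\, \Big[\, H\big(P(f \mid e_{1:k})\big) \;-\; \mathbb{E}_{e \sim t}\, H\big(P(f \mid e_{1:k}, e)\big) \,\Big],$$

where $H$ denotes Shannon entropy and $t^{*}$ is the next test case to run. Probabilistic graphical models make such updates tractable by factorising the joint distribution over the structure of the service architecture.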

Acknowledgments

This research was conducted in the context of the MIDAS project (EC FP7 Project Number 318786), partially funded by the European Commission.

Author information

Corresponding author

Correspondence to Lom Messan Hillah.

Appendices

Appendix A: TCM documents of Virtual Portal and MPIIXS components

This section shows the contents of the Test Configuration Model (TCM) XML documents of the Virtual Portal and MPIIXS components, and of the complete topology depicted in Fig. 16.

1.1 A.1: TCM document of Virtual Portal

[Listing: TCM XML document of the Virtual Portal component]
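Since the original listing is rendered as an image in the source and is not reproduced here, the fragment below is a purely hypothetical sketch of the kind of information such a TCM document carries for one component: the service it exposes, the interface and binding of that service, and the references it requires from partner components. All element names, attributes, and URLs are assumptions, not the actual MIDAS TCM schema; the MPIIXS document in A.2 follows the same structure.

<!-- Hypothetical TCM sketch; not the actual MIDAS schema -->
<testConfiguration name="VirtualPortal">
  <component name="VirtualPortal">
    <!-- Service exposed by the component under test (illustrative values) -->
    <service name="PortalService"
             interface="http://example.org/wsdl/VirtualPortal#PortalPort"
             binding="SOAP/HTTP"
             endpoint="http://example.org/services/VirtualPortal"/>
    <!-- Service the component requires from a partner (illustrative) -->
    <reference name="ixsReference"
               interface="http://example.org/wsdl/MPIIXS#IXSPort"/>
  </component>
</testConfiguration>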

1.2 A.2: TCM document of MPIIXS

[Listing: TCM XML document of the MPIIXS component]

1.3 A.3: TCM document of the complete topology

[Listing: TCM XML document of the complete topology]
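Analogously, and again as a hypothetical sketch rather than the actual schema, a complete-topology TCM can be pictured as the set of participating components plus the wires that bind each component's references to its partners' services:

<!-- Hypothetical topology sketch; element names are assumptions -->
<testConfiguration name="CompleteTopology">
  <componentRef document="virtualportal-tcm.xml"/>
  <componentRef document="mpiixs-tcm.xml"/>
  <!-- Wire the portal's required interface to the MPIIXS service -->
  <wire source="VirtualPortal/ixsReference" target="MPIIXS/IXSService"/>
</testConfiguration>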

Appendix B: SBM documents of Virtual Portal and MPIIXS components

This section shows the contents of the Service Behaviour Model (SBM) XML documents of the Virtual Portal and MPIIXS components, depicted in Fig. 17.

1.1 B.1: SBM document of Virtual Portal

[Listing: SBM XML document of the Virtual Portal component]
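The listing itself is not reproduced here; assuming the SBM follows a state-chart style, the following W3C SCXML fragment sketches what a behaviour model of a front-end service could look like. The state and event names are invented for illustration and are not taken from the actual Virtual Portal model; the MPIIXS model in B.2 would be analogous.

<!-- Hypothetical behaviour sketch in standard SCXML notation -->
<scxml xmlns="http://www.w3.org/2005/07/scxml" version="1.0" initial="idle">
  <state id="idle">
    <!-- An incoming request moves the service to a waiting state -->
    <transition event="queryPatientRecord" target="awaitingIXS"/>
  </state>
  <state id="awaitingIXS">
    <!-- The partner service answers, or a fault is raised -->
    <transition event="ixsResponse" target="idle"/>
    <transition event="ixsFault" target="failed"/>
  </state>
  <final id="failed"/>
</scxml>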

1.2 B.2: SBM document of MPIIXS

[Listing: SBM XML document of the MPIIXS component]

Appendix C: Configuration file

This section shows the content of the configuration file for the Healthcare Pilot experiment described in Sect. 4.1.

[Listing: configuration file for the Healthcare Pilot experiment]
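The actual file format is not shown here; as a purely hypothetical illustration, a session configuration could simply bundle the pointers the platform needs: the uploaded models, the test method to invoke, and a generation budget. Every element and value below is an assumption, not the MIDAS format.

<!-- Hypothetical session configuration; not the MIDAS format -->
<configuration pilot="Healthcare">
  <tcm>models/complete-topology-tcm.xml</tcm>
  <sbm component="VirtualPortal">models/virtualportal-sbm.xml</sbm>
  <sbm component="MPIIXS">models/mpiixs-sbm.xml</sbm>
  <testMethod>functional</testMethod>
  <maxTestCases>50</maxTestCases>
</configuration>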

Appendix D: Test generation directive

This section shows the content of a typical test generation directive, as issued by the dynamic scheduler or by the end user upon first upload of the archive containing all the DSUT modelling artefacts onto the MIDAS platform, before the functional test method is invoked.

[Listing: test generation directive]
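To make the role of the directive concrete, the following purely hypothetical fragment sketches what it could specify: the model to exercise, a generation budget, and a coverage goal. None of these element names come from the MIDAS platform.

<!-- Hypothetical test generation directive; element names are assumptions -->
<testGenerationDirective>
  <target component="VirtualPortal" model="models/virtualportal-sbm.xml"/>
  <budget testCases="20"/>
  <coverage criterion="all-transitions"/>
</testGenerationDirective>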


About this article

Cite this article

Hillah, L.M., Maesano, A.-P., De Rosa, F. et al. Automation and intelligent scheduling of distributed system functional testing. Int J Softw Tools Technol Transfer 19, 281–308 (2017). https://doi.org/10.1007/s10009-016-0440-3
