Local Observability and Controllability Enforcement in Distributed Testing

  • Bruno Lima
  • João Pascoal Faria
  • Robert Hierons
Conference paper. Part of the Communications in Computer and Information Science book series (CCIS, volume 1010).

Abstract

To ensure interoperability and correct end-to-end behavior of heterogeneous distributed systems, it is important to conduct integration tests that verify the interactions with the environment and between the system components in key scenarios. The automation of such integration tests requires that the test components are also distributed, with local testers deployed close to the system components and coordinated by a central tester. In such a test architecture, it is important to maximize the autonomy of the local testers, in order to minimize the communication overhead and maximize the fault detection capability. A test scenario is called locally observable if conformance errors can be detected locally by the local testers, and locally controllable if test inputs can be decided locally by the local testers, in both cases without exchanging coordination messages between the test components during test execution (i.e., without communication overhead). For test scenarios specified by means of UML sequence diagrams that do not exhibit these properties, we present in this paper an approach, with tool support, to automatically find coordination messages that, when added to the given scenario, make it locally controllable and locally observable.
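
To give a concrete feel for the controllability-enforcement idea, the following is a minimal sketch only, not the paper's actual algorithm or tool: it models a scenario as a totally ordered list of messages (rather than a full UML sequence diagram), and all names (Msg, TESTER_OF, enforce_local_controllability, the C1/C2/T1/T2 identifiers) are hypothetical. The rule applied, a local tester may send a test input only if it observes the immediately preceding event, is a deliberate simplification of the controllability conditions studied in the distributed-testing literature.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Msg:
    sender: str    # lifeline that sends the message
    receiver: str  # lifeline that receives it

# Hypothetical setup: SUT components C1 and C2, each with a co-located
# local tester (T1, T2); a tester observes every event at its component.
TESTER_OF = {"C1": "T1", "C2": "T2"}
TESTERS = set(TESTER_OF.values())

def observers(lifeline: str) -> set[str]:
    """Local testers that observe an event occurring at this lifeline."""
    return {lifeline} if lifeline in TESTERS else {TESTER_OF[lifeline]}

def enforce_local_controllability(scenario: list[Msg]) -> list[Msg]:
    """Insert coordination messages so every test input can be decided locally.

    Simplified rule: a local tester may send a message only if it observes
    the immediately preceding event; otherwise a tester that did observe
    that event first notifies the sender with a coordination message.
    """
    result: list[Msg] = []
    for msg in scenario:
        if result and msg.sender in TESTERS:
            prev = result[-1]
            aware = observers(prev.sender) | observers(prev.receiver)
            if msg.sender not in aware:
                notifier = sorted(aware)[0]  # any tester aware of prev event
                result.append(Msg(notifier, msg.sender))  # coordination msg
        result.append(msg)
    return result

# T2 must stimulate C2 right after C1 replies to T1 -- an event T2 does
# not observe, so a T1 -> T2 coordination message must be inserted.
scenario = [Msg("T1", "C1"), Msg("C1", "T1"), Msg("T2", "C2")]
for m in enforce_local_controllability(scenario):
    print(f"{m.sender} -> {m.receiver}")
```

On the example, the sketch inserts a T1 -> T2 coordination message before T2's input to C2, since T2 cannot otherwise know that C1 has already replied to T1. Local observability could be handled analogously, by forwarding to some local tester the observations it needs to check an output it does not see directly.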

Keywords

Model-based testing · Observability · Controllability · Integration testing · Distributed systems · UML

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Faculty of Engineering of the University of Porto, Porto, Portugal
  2. INESC TEC, Porto, Portugal
  3. The University of Sheffield, Sheffield, UK
