A Survey on Testing Distributed and Heterogeneous Systems: The State of the Practice

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 743)

Abstract

Distributed and heterogeneous systems (DHS), running over interconnected mobile and cloud-based platforms, are used in a growing number of domains to provision end-to-end services to users. Testing DHS is particularly important and challenging, with little support provided by current tools. To assess the current state of the practice in testing DHS and to identify opportunities and priorities for research and innovation initiatives, we conducted an exploratory survey answered by 147 software testing professionals attending industry-oriented software testing conferences. The survey allowed us to assess the relevance of DHS in software testing practice, the most important features to be tested in DHS, the current status of test automation and tool sourcing for testing DHS, and the most desired features in test automation solutions for DHS. Follow-up interviews allowed us to further investigate drivers of and barriers to DHS test automation. We expect the results presented in this paper to be of interest to researchers, tool vendors, and service providers in this field.

Keywords

Software testing · Distributed systems · Heterogeneous systems · Systems of systems · State of the practice

Acknowledgements

This research work was performed in the scope of the project “NanoSTIMA: Macro-to-Nano Human Sensing: Towards Integrated Multimodal Health Monitoring and Analytics” (NORTE-01-0145-FEDER-000016), financed by the North Portugal Regional Operational Programme (NORTE 2020), under the PORTUGAL 2020 Partnership Agreement, and through the European Regional Development Fund (ERDF).

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. INESC TEC, FEUP Campus, Porto, Portugal
  2. Faculty of Engineering, University of Porto, Porto, Portugal
