
Why Software Testing Fails: Common Pitfalls Observed in a Critical Smart Metering Project

  • Stefan Mohacsi
  • Rudolf Ramler
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 338)

Abstract

Over the last decades, a considerable share of software engineering research has been dedicated to software testing. Nevertheless, testing often fails or causes major problems in practice. In this paper, we present insights and experiences from a large project in the energy sector. The obligatory switch from analog energy meters to smart metering technology poses a major challenge for many energy providers. Apart from technical issues concerning meters and transmission technology, the adaptation of internal business processes and the development of backend software can turn out to be more difficult than expected. The criticality, size, and complexity of the analyzed project are reflected in software and system testing, where underestimated effort, mistakes, and wrong decisions caused serious difficulties. In our work, we describe the observed testing problems and their underlying causes. Subsequently, we compare the identified problems with a catalogue of commonly known testing pitfalls and anti-patterns. The results show that the majority of the observed problems are neither new nor specific to the studied project. Furthermore, additional candidates for extending the list of common pitfalls are identified. Besides recommendations on how to mitigate the problems in the studied project, we conclude with the general insight that there is great potential for improving software testing practice by developing measures for the early recognition, communication, and avoidance of common mistakes.

Keywords

Software testing · System testing · Test management · Common testing pitfalls · Testing anti-patterns · Smart metering

Acknowledgments

This research has been supported by the Austrian Research Promotion Agency, the Austrian Ministry for Transport, Innovation and Technology, the Federal Ministry of Science, Research and Economy, and the Province of Upper Austria in the frame of the COMET center SCCH (FFG 844597).


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Atos IT Solutions and Services GmbH, Vienna, Austria
  2. Software Competence Center Hagenberg GmbH, Hagenberg, Austria
