
Who Is Afraid of Test Smells? Assessing Technical Debt from Developer Actions

  • Conference paper
  • Testing Software and Systems (ICTSS 2023)

Abstract

Test smells are patterns in test code that may indicate poor code quality. Some recent studies have cast doubt on the accuracy and usefulness of the test smells proposed and studied by the research community. In this study, we aimed to determine whether developers view these test smells as sources of technical debt worth spending effort to remove. We selected 12 substantial open-source software systems and mapped how 19 test smells from the literature were introduced into and removed from the code base over time. Of these 19 smells, our results show that: 1) four were rarely detected in the selected projects; 2) three were removed from the code bases quickly, while another three were removed slowly; 3) the remaining nine showed no consistent pattern of quick or delayed removal. Our results suggest that, with current testing tool sets, the test smells studied by researchers do not capture developers' true concerns about test quality: only three of the 19 smells showed clear evidence of developer concern.
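The analysis the abstract describes, classifying each smell type by how quickly developers remove its instances, can be sketched as follows. This is a minimal illustration with made-up data; the function names, thresholds, and classification labels are our own simplification, not the paper's actual pipeline (which is linked in the notes below).

```python
from datetime import date
from statistics import median

def smell_lifetimes(events):
    """events: (introduced, removed) date pairs for instances of one smell type.
    Returns lifetimes in days for instances that were actually removed."""
    return [(removed - introduced).days
            for introduced, removed in events if removed is not None]

def classify(events, fast_days=30, slow_days=365):
    """Label a smell type by the median time developers took to remove it.
    Thresholds are illustrative, not taken from the paper."""
    lifetimes = smell_lifetimes(events)
    if not lifetimes:
        return "rarely detected"
    m = median(lifetimes)
    if m <= fast_days:
        return "removed quickly"
    if m >= slow_days:
        return "removed slowly"
    return "no clear pattern"

# Made-up instances of one smell: (date introduced, date removed or None)
events = [
    (date(2021, 1, 5), date(2021, 1, 20)),   # removed after 15 days
    (date(2021, 3, 1), date(2021, 3, 10)),   # removed after 9 days
    (date(2021, 6, 1), None),                # still present at end of history
]
print(classify(events))  # -> removed quickly
```

In the study itself, the introduction and removal events come from running smell detectors over each project's commit history; the sketch above only shows the final aggregation step.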


Notes

  1. The study design was approved by the Computer Science Department Panel, The University of Manchester, Ref: 2023-15405-27595. All authors are available for clarifications.

  2. The pipeline code is available at https://github.com/ZhongyanChen/tsObservatory.

  3. https://junit.org/junit4/faq.html#running_15, accessed on 2023/03/30.

  4. The full data set of this study is provided as supplementary information accompanying this paper at https://figshare.manchester.ac.uk/projects/Evaluating_Test_Smells_in_Open-Source_Projects/164461.



Author information

Corresponding author

Correspondence to Zhongyan Chen.



Copyright information

© 2023 IFIP International Federation for Information Processing

About this paper


Cite this paper

Chen, Z., Embury, S.M., Vigo, M. (2023). Who Is Afraid of Test Smells? Assessing Technical Debt from Developer Actions. In: Bonfanti, S., Gargantini, A., Salvaneschi, P. (eds) Testing Software and Systems. ICTSS 2023. Lecture Notes in Computer Science, vol 14131. Springer, Cham. https://doi.org/10.1007/978-3-031-43240-8_11


  • DOI: https://doi.org/10.1007/978-3-031-43240-8_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43239-2

  • Online ISBN: 978-3-031-43240-8

