Evaluating the Effects of Different Requirements Representations on Writing Test Cases

  • Conference paper
  • In: Requirements Engineering: Foundation for Software Quality (REFSQ 2020)

Abstract

[Context and Motivation] One must test a system to ensure that its requirements are met; thus, tests are often derived manually from requirements. However, requirements representations are diverse: from traditional IEEE-style text, to models, to agile user stories, the RE community of research and practice has explored various ways to capture requirements. [Question/problem] But do these different representations influence the quality or coverage of test suites? The state of the art does not provide insights on whether the representation of requirements has an impact on the coverage, quality, or size of the resulting test suite. [Results] In this paper, we report on a family of three experiment replications, conducted with 148 students, that examines the effect of different requirements representations on test creation. We find that, in general, the different requirements representations have no statistically significant impact on the number of derived tests, but specific affordances of the representation affect test quality: for example, traditional textual requirements make it easier to derive less abstract tests, whereas goal models yield less inconsistent test purpose descriptions. [Contribution] Our findings give insights into the effects of requirements representation on test derivation for novice testers. Our work is limited by its use of student participants.
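
The abstract's claim of no statistically significant impact on the number of derived tests implies a between-groups comparison of test counts. As a purely illustrative sketch (not the authors' actual analysis, which the abstract does not specify), such a comparison could use a non-parametric Kruskal-Wallis test, a common choice for small, possibly non-normal count data; the group names and counts below are hypothetical.

    # Illustrative sketch with hypothetical data; not the study's actual analysis.
    # Compares the number of test cases derived per participant across three
    # requirements representations using the Kruskal-Wallis H-test.
    from scipy.stats import kruskal

    ieee_text  = [12, 9, 14, 11, 10, 13]   # hypothetical counts per participant
    user_story = [10, 11, 9, 12, 13, 10]
    goal_model = [8, 12, 10, 11, 9, 12]

    stat, p = kruskal(ieee_text, user_story, goal_model)
    print(f"H = {stat:.2f}, p = {p:.3f}")
    # p > 0.05 would be consistent with the reported finding that representation
    # does not significantly affect the number of derived tests.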

Author information

Corresponding author: Francisco Gomes de Oliveira Neto.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

de Oliveira Neto, F.G., Horkoff, J., Svensson, R., Mattos, D., Knauss, A. (2020). Evaluating the Effects of Different Requirements Representations on Writing Test Cases. In: Madhavji, N., Pasquale, L., Ferrari, A., Gnesi, S. (eds) Requirements Engineering: Foundation for Software Quality. REFSQ 2020. Lecture Notes in Computer Science, vol 12045. Springer, Cham. https://doi.org/10.1007/978-3-030-44429-7_18

  • DOI: https://doi.org/10.1007/978-3-030-44429-7_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-44428-0

  • Online ISBN: 978-3-030-44429-7

  • eBook Packages: Computer Science (R0)
