Cost-Benefit Analysis of Using Dependency Knowledge at Integration Testing

  • Sahar Tahvili
  • Markus Bohlin
  • Mehrdad Saadatmand
  • Stig Larsson
  • Wasif Afzal
  • Daniel Sundmark
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10027)


In software system development, testing can take considerable time and resources, and there are numerous examples in the literature of how to improve the testing process. In particular, methods for selection and prioritization of test cases can play a critical role in the efficient use of testing resources. This paper focuses on the problem of selection and ordering of integration-level test cases. Integration testing is performed to evaluate the correctness of several units in composition. Further, for reasons of both effectiveness and safety, many embedded systems are still tested manually. To address this, we propose a process, supported by an online decision support system, for ordering and selection of test cases based on the results of previously executed test cases. To analyze the economic efficiency of such a system, a customized return on investment (ROI) metric tailored for system integration testing is introduced. Using data collected from the development process of a large-scale safety-critical embedded system, we perform Monte Carlo simulations to evaluate the expected ROI of three variants of the proposed new process. The results show that our proposed decision support system is beneficial in terms of ROI at system integration testing and thus qualifies as an important element in improving the integration testing process.
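The Monte Carlo approach described above can be illustrated with a minimal sketch: uncertain cost and benefit inputs are drawn repeatedly from assumed distributions, and the resulting ROI distribution is summarized. Note that the distributions, parameter ranges, and the simple ROI formula (benefit minus cost, over cost) below are illustrative assumptions for exposition only; the paper's actual cost model, input data, and process variants are not reproduced here.

```python
import random

def simulate_roi(n_runs=10000, seed=42):
    """Monte Carlo sketch of expected ROI under uncertain inputs.

    All distributions below are hypothetical placeholders, not the
    paper's calibrated model. ROI = (benefit - cost) / cost.
    """
    rng = random.Random(seed)
    rois = []
    for _ in range(n_runs):
        # Hypothetical cost of introducing/operating the decision
        # support system (person-hours).
        cost_of_dss = rng.uniform(80, 120)
        # Hypothetical savings: redundant test executions avoided by
        # exploiting dependency knowledge, times effort per execution.
        saved_executions = rng.uniform(100, 300)
        hours_per_execution = rng.uniform(0.5, 2.0)
        benefit = saved_executions * hours_per_execution
        rois.append((benefit - cost_of_dss) / cost_of_dss)
    rois.sort()
    return {
        "mean": sum(rois) / n_runs,
        "p05": rois[int(0.05 * n_runs)],   # 5th percentile (pessimistic)
        "p95": rois[int(0.95 * n_runs)],   # 95th percentile (optimistic)
    }
```

Running many such draws yields not just a point estimate but an ROI interval, which is what makes the Monte Carlo treatment useful when process parameters are uncertain.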


Keywords: Process improvement · Software testing · Decision support system · Integration testing · Test case selection · Prioritization · Optimization · Return on investment



This work was supported by VINNOVA grant 2014-03397 through the IMPRINT project and the Swedish Knowledge Foundation (KKS) grant 20130085 through the TOCSYC project and the ITS-EASY industrial research school. Special thanks to Johan Zetterqvist, Ola Sellin and Mahdi Sarabi at Bombardier Transportation, Västerås, Sweden.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Sahar Tahvili (1, 2)
  • Markus Bohlin (1)
  • Mehrdad Saadatmand (1, 2)
  • Stig Larsson (1)
  • Wasif Afzal (2)
  • Daniel Sundmark (2)
  1. SICS Swedish ICT, Västerås, Sweden
  2. Mälardalen University, Västerås, Sweden
