
Optimal Test Suite Generation for Modified Condition Decision Coverage Using SAT Solving

  • Takashi Kitamura
  • Quentin Maissonneuve
  • Eun-Hye Choi
  • Cyrille Artho
  • Angelo Gargantini
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11093)

Abstract

Boolean expressions occur frequently in descriptions of computer systems, and in complex systems they tend to be large and error-prone. The modified condition/decision coverage (MCDC) criterion is an important testing technique for Boolean expressions, as its use is mandated by safety standards such as DO-178C [1] (avionics) and ISO 26262 [2] (automotive). In this paper, we develop an algorithm to generate optimal MCDC test suites for Boolean expressions. Our algorithm is based on SAT solving and generates minimal MCDC test suites. Experiments on a real-world avionics system confirm that the technique constructs minimal MCDC test suites within reasonable time and improves significantly on prior techniques.
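
To make the coverage criterion concrete, the sketch below shows how a single MCDC requirement can be phrased as a satisfiability problem and handed to the Z3 solver [22] through its Python API. It is only an illustration of the underlying idea (finding a pair of test cases in which exactly one condition flips and the decision outcome flips with it), not the authors' algorithm; the decision a and (b or c) and all variable names are invented for the example.

    # Illustrative sketch: find an MCDC pair for condition `a` in the
    # example decision d(a, b, c) = a and (b or c), using Z3 [22].
    # An MCDC pair for `a` is two test cases that differ only in `a`
    # and change the decision outcome, showing that `a` independently
    # affects the result.
    from z3 import Bools, Solver, And, Or, Xor, sat

    a1, b1, c1 = Bools('a1 b1 c1')   # first test case
    a2, b2, c2 = Bools('a2 b2 c2')   # second test case

    def decision(a, b, c):
        return And(a, Or(b, c))      # example decision: a and (b or c)

    s = Solver()
    s.add(Xor(a1, a2))               # the target condition `a` flips ...
    s.add(b1 == b2, c1 == c2)        # ... while the other conditions stay fixed
    s.add(Xor(decision(a1, b1, c1),  # ... and the decision outcome flips too
              decision(a2, b2, c2)))

    if s.check() == sat:
        m = s.model()
        print('test 1:', [(str(v), m[v]) for v in (a1, b1, c1)])
        print('test 2:', [(str(v), m[v]) for v in (a2, b2, c2)])

Generating a whole minimal test suite, as done in the paper, additionally requires covering such a pair for every condition of the decision while minimizing the number of distinct test cases, e.g. by repeated or incremental SAT solving [21].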


Acknowledgment

We thank Hélène Waeselynck and anonymous reviewers whose comments have greatly improved this paper.

References

  1. RTCA: DO-178C: Software considerations in airborne systems and equipment certification (2011)
  2. ISO 26262: International Organization for Standardization: Road vehicles - functional safety (2009)
  3. Chilenski, J.J., Miller, S.P.: Applicability of modified condition/decision coverage to software testing. Softw. Eng. J. 9(5), 193–200 (1994)
  4. Ammann, P., Offutt, J.: Introduction to Software Testing. Cambridge University Press, Cambridge (2008)
  5. Hayhurst, K.J., Veerhusen, D.S., Chilenski, J.J., Rierson, L.K.: A practical tutorial on modified condition/decision coverage. Technical report NASA/TM-2001-210876, NASA (2001)
  6. Felbinger, H., Pill, I., Wotawa, F.: Classifying test suite effectiveness via model inference and ROBBDs. In: Aichernig, B.K., Furia, C.A. (eds.) TAP 2016. LNCS, vol. 9762, pp. 76–93. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-41135-4_5
  7. Bloem, R., Hein, D., Röck, F., Schumi, R.: Case study: automatic test case generation for a secure cache implementation. In: Blanchette, J.C., Kosmatov, N. (eds.) TAP 2015. LNCS, vol. 9154, pp. 58–75. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-21215-9_4
  8. Weyuker, E.J., Goradia, T., Singh, A.: Automatically generating test data from a Boolean specification. IEEE Trans. Softw. Eng. 20(5), 353–363 (1994)
  9. Dupuy, A., Leveson, N.: An empirical evaluation of the MC/DC coverage criterion on the HETE-2 satellite software. In: Proceedings of the 19th Digital Avionics Systems Conference (DASC 2000) (2000)
  10. Vilkomir, S., Starov, O., Bhambroo, R.: Evaluation of T-wise approach for testing logical expressions in software. In: Proceedings of the Sixth IEEE International Conference on Software Testing, Verification and Validation (ICST 2013) Workshops, Luxembourg, pp. 249–256 (2013)
  11. Kapoor, K., Bowen, J.P.: A formal analysis of MCDC and RCDC test criteria. Softw. Test. Verif. Reliab. 15(1), 21–40 (2005)
  12. Sun, Y., Huang, X., Kroening, D.: Testing deep neural networks (2018). arXiv:1803.04792
  13. Jones, J.A., Harrold, M.J.: Test-suite reduction and prioritization for modified condition/decision coverage. IEEE Trans. Softw. Eng. 29(3), 195–209 (2003)
  14. Arcaini, P., Gargantini, A., Riccobene, E.: How to optimize the use of SAT and SMT solvers for test generation of Boolean expressions. Comput. J. 58(11), 2900–2920 (2015)
  15. Chilenski, J.J.: An investigation of three forms of the modified condition decision coverage (MCDC) criterion. Technical report DOT/FAA/AR-01/18, Office of Aviation Research (2001)
  16. Vilkomir, S.A., Bowen, J.P.: From MC/DC to RC/DC: formalization and analysis of control-flow testing criteria. In: Formal Methods and Testing, An Outcome of the FORTEST Network, Revised Selected Papers, pp. 240–270 (2008)
  17. Whalen, M.W., Gay, G., You, D., Heimdahl, M.P.E., Staats, M.: Observable modified condition/decision coverage. In: 35th International Conference on Software Engineering (ICSE 2013), San Francisco, CA, USA, pp. 102–111 (2013)
  18. Offutt, A.J., Liu, S., Abdurazik, A., Ammann, P.: Generating test data from state-based specifications. Softw. Test. Verif. Reliab. 13(1), 25–53 (2003)
  19. Bloem, R., Greimel, K., Koenighofer, R., Roeck, F.: Model-based MCDC testing of complex decisions for the Java Card applet firewall. In: Proceedings of the Fifth International Conference on Advances in System Testing and Validation Lifecycle (VALID 2013), pp. 1–6 (2013)
  20. Hnich, B., Prestwich, S., Selensky, E.: Constraint-based approaches to the covering test problem. In: Faltings, B.V., Petcu, A., Fages, F., Rossi, F. (eds.) CSCLP 2004. LNCS (LNAI), vol. 3419, pp. 172–186. Springer, Heidelberg (2005). https://doi.org/10.1007/11402763_13
  21. Yamada, A., Kitamura, T., Artho, C., Choi, E.H., Oiwa, Y., Biere, A.: Optimization of combinatorial testing by incremental SAT solving. In: Proceedings of the 8th IEEE International Conference on Software Testing, Verification and Validation (ICST 2015), pp. 1–10 (2015)
  22. de Moura, L., Bjørner, N.: Z3: an efficient SMT solver. In: Ramakrishnan, C.R., Rehof, J. (eds.) TACAS 2008. LNCS, vol. 4963, pp. 337–340. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-78800-3_24
  23. Heimdahl, M.P.E., Devaraj, G.: On the effect of test-suite reduction on automatically generated model-based tests. Autom. Softw. Eng. 14(1), 37–57 (2007)
  24. Gargantini, A., Heitmeyer, C.L.: Using model checking to generate tests from requirements specifications. In: Proceedings of the 7th European Software Engineering Conference, held jointly with the 7th ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC/FSE 1999), pp. 146–162 (1999)
  25. Pandita, R., Xie, T., Tillmann, N., de Halleux, J.: Guided test generation for coverage criteria. In: Proceedings of the 26th IEEE International Conference on Software Maintenance (ICSM 2010), Timisoara, Romania, pp. 1–10 (2010)
  26. Bokil, P., Darke, P., Shrotri, U., Venkatesh, R.: Automatic test data generation for C programs. In: Proceedings of the Third IEEE International Conference on Secure Software Integration and Reliability Improvement (SSIRI 2009), pp. 359–368 (2009)
  27. Awedikian, Z., Ayari, K., Antoniol, G.: MC/DC automatic test input data generation. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2009), Canada, pp. 1657–1664 (2009)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Takashi Kitamura (1)
  • Quentin Maissonneuve (1, 2)
  • Eun-Hye Choi (1)
  • Cyrille Artho (3)
  • Angelo Gargantini (4)
  1. National Institute of Advanced Industrial Science and Technology (AIST), Osaka, Japan
  2. University of Nantes, Nantes, France
  3. KTH Royal Institute of Technology, Stockholm, Sweden
  4. Università di Bergamo, Bergamo, Italy
