From MC/DC to RC/DC: Formalization and Analysis of Control-Flow Testing Criteria

  • Sergiy A. Vilkomir
  • Jonathan P. Bowen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4949)

Abstract

This chapter describes an approach to the formalization of existing criteria used in software testing of computer systems and proposes the Reinforced Condition/Decision Coverage (RC/DC) criterion. RC/DC has been developed from the well-known Modified Condition/Decision Coverage (MC/DC) criterion and is better suited to the testing of safety-critical software, where MC/DC may not provide adequate assurance. The Z notation has been selected as the formal language for describing the criteria, and formal Z definitions of RC/DC, as well as MC/DC and other criteria, are presented. Specific examples of applying these criteria to specification-based testing are considered, and some of their properties are formally proved. This characterization aids the understanding of different types of testing and the correct application of a desired testing regime.
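For intuition about how RC/DC reinforces MC/DC, the sketch below (an informal Python illustration, not the chapter's formal Z definitions; the decision `(a and b) or c` and the helper `pairs_for_condition` are hypothetical choices made only for this example) enumerates test-case pairs that differ in exactly one condition. Pairs that flip the decision witness the MC/DC independence requirement; pairs that keep the decision's value are the additional "keep" pairs that RC/DC requires beyond MC/DC.

```python
# A minimal sketch, assuming a decision over boolean conditions;
# it classifies test-case pairs rather than generating a minimal test set.
from itertools import product

def decision(a, b, c):
    # Hypothetical decision used only for illustration.
    return (a and b) or c

def pairs_for_condition(dec, n, i):
    """Classify pairs of n-condition test cases that differ only in condition i.

    A pair that changes the decision's outcome is an MC/DC 'independence'
    pair; a pair that leaves the outcome unchanged is an RC/DC 'keep' pair,
    which RC/DC requires in addition to the MC/DC pairs (for each decision
    value where such a pair exists).
    """
    mcdc, keep = [], []
    for v in product([False, True], repeat=n):
        if v[i]:
            continue  # visit each unordered pair once: condition i goes False -> True
        w = v[:i] + (True,) + v[i + 1:]
        (mcdc if dec(*v) != dec(*w) else keep).append((v, w))
    return mcdc, keep

for i, name in enumerate("abc"):
    mcdc, keep = pairs_for_condition(decision, 3, i)
    print(f"condition {name}: {len(mcdc)} MC/DC pair(s), {len(keep)} RC/DC keep pair(s)")
```

For this decision, condition `c`, for instance, has three independence pairs but only one keep pair, illustrating how demonstrating that a condition can also preserve a decision's value adds test cases that MC/DC alone never demands.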

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Sergiy A. Vilkomir (1)
  • Jonathan P. Bowen (2)

  1. Software Quality Research Laboratory (SQRL), Department of Electrical Engineering and Computer Science, The University of Tennessee, Knoxville, USA
  2. Centre for Research on Evolution, Search and Testing (CREST), Department of Computer Science, King's College London, Strand, London, UK
