A Systematic Approach to Requirements Driven Test Generation for Safety Critical Systems
We describe ongoing work on the generation of test cases for safety critical systems using Event-B and the Rodin toolset. Verification of software to DO-178C is a two-stage process. First, a suite of test cases must be validated against the system requirements (requirements coverage); then the software implementation is verified using the validated test suite. During verification of the implementation, structural coverage is also measured.
Our work focuses on the first step, the generation of test cases and their validation against the requirements. We construct closed-system models incorporating both the system to be tested and its environment. These models capture the system requirements, and describe the interactions between the system and its environment. In particular, safety constraints can be represented by invariants, and their preservation ensured through event guards. From these models test cases can be generated, and requirements coverage can be measured from model coverage.
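The modelling idea above can be sketched in miniature: a closed-system model is a guarded transition system in which event guards are chosen so that a safety invariant is preserved, and test cases are paths through the model. The following Python sketch is illustrative only, with a hypothetical valve-controller model that is not taken from the paper; the event names, guards, and breadth-first generation strategy are all assumptions made for the example.

```python
# Hypothetical sketch of requirements-driven test generation: a closed-system
# model as a guarded transition system. Guards are written so that every
# reachable state satisfies the safety invariant; each event path is a test.
from collections import deque

# Toy model (illustrative, not from the paper): a controller that may open a
# valve only when pressure is zero. State: (valve_open, pressure)
INITIAL = (False, 0)

def invariant(state):
    valve_open, pressure = state
    # Safety constraint: the valve must never be open at high pressure.
    return not (valve_open and pressure >= 2)

# Events as (name, guard, action) triples; guards restrict when an event may
# fire, which is what ensures the invariant is preserved.
EVENTS = [
    ("pressurise",  lambda s: s[1] < 2 and not s[0], lambda s: (s[0], s[1] + 1)),
    ("vent",        lambda s: s[1] > 0,              lambda s: (s[0], s[1] - 1)),
    ("open_valve",  lambda s: not s[0] and s[1] == 0, lambda s: (True, s[1])),
    ("close_valve", lambda s: s[0],                  lambda s: (False, s[1])),
]

def generate_tests(max_depth=3):
    """Breadth-first exploration: each sequence of enabled events is a test
    case; the invariant is checked at every reached state."""
    tests, seen = [], set()
    queue = deque([(INITIAL, [])])
    while queue:
        state, trace = queue.popleft()
        assert invariant(state), f"invariant violated after {trace}"
        if trace:
            tests.append(trace)
        if len(trace) == max_depth or state in seen:
            continue
        seen.add(state)
        for name, guard, action in EVENTS:
            if guard(state):
                queue.append((action(state), trace + [name]))
    return tests

tests = generate_tests()
# Model coverage measured as the set of events exercised by the test suite.
covered = {event for trace in tests for event in trace}
print(len(tests), sorted(covered))
```

In this toy run all four events are exercised, so event coverage of the model is complete; in the approach described above, such model coverage is what stands in for requirements coverage of the generated suite.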
Keywords: Event-B, STPA, Safety Critical Systems, Test Generation
- 1. Abrial, J.-R.: The B-Book: Assigning Programs to Meanings. Cambridge University Press (1996)
- 2. Abrial, J.-R.: Modeling in Event-B: System and Software Engineering. Cambridge University Press (2010)
- 3. Abrial, J.-R., Butler, M., Hallerstede, S., Hoang, T.S., Mehta, F., Voisin, L.: Rodin: an open toolset for modelling and reasoning in Event-B. International Journal on Software Tools for Technology Transfer 12(6), 447–466 (2010)
- 4. ADVANCE: D5.2 - ADVANCE Process Integration II (2013), http://www.advance-ict.eu/files/AdvanceD5.2.pdf
- 5. Bernot, G., Gaudel, M.-C., Marre, B.: Software testing based on formal specifications: a theory and a tool. Software Engineering Journal 6(6), 387–405 (1991)
- 7. Colley, J., Butler, M.: A formal, systematic approach to STPA using Event-B refinement and proof. In: Dale, C., Anderson, T. (eds.) Assuring the Safety of Systems: Proceedings of the 21st Safety-Critical Systems Symposium (2013)
- 10. Gaudel, M.-C.: Testing can be formal, too. In: Mosses, P.D., Nielsen, M., Schwartzbach, M.I. (eds.) CAAP 1995, FASE 1995, and TAPSOFT 1995. LNCS, vol. 915, pp. 82–96. Springer, Heidelberg (1995)
- 11. Heimdahl, M., George, D., Weber, R.: Specification test coverage adequacy criteria = specification test generation inadequacy criteria? In: Proceedings of the Eighth IEEE International Symposium on High Assurance Systems Engineering, pp. 178–186 (2004)
- 13. International Software Testing Qualifications Board: Advanced Level Syllabus: Test Analyst (2012)
- 15. Leveson, N.G.: Engineering a Safer World: Systems Thinking Applied to Safety. The MIT Press (2011)
- 16. RTCA, Inc.: DO-178B, Software Considerations in Airborne Systems and Equipment Certification (December 1992)
- 17. RTCA, Inc.: DO-178C, Software Considerations in Airborne Systems and Equipment Certification (December 2011)
- 18. SAE International: ARP4754A, Guidelines for Development of Civil Aircraft and Systems (December 2010)