Arguing on Software-Level Verification Techniques Appropriateness

  • Carmen Cârlan
  • Barbara Gallina
  • Severin Kacianka
  • Ruth Breu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10488)


In this paper, we investigate the deliberate selection of innovative software verification technology in the safety-critical domain and its implications. Verification tools perform analysis, testing, or simulation activities. For the techniques implemented by these tools to serve as means of compliance in the context of DO-178C and its supplements, their suitability for fulfilling standard-mandated objectives must be explained to the certification body. Without a systematic method for arguing this appropriateness, practitioners find it difficult to adopt novel techniques. We therefore offer a safety-case-based method for arguing that a given verification technique, potentially in combination with other techniques, is appropriately applied to produce the evidence needed to satisfy certification objectives regarding fault detection and mitigation in a realistic avionics application. We apply this method to the choice of an appropriate compiler to support the development of a drone.


Keywords: Safety cases · Faults · Standard compliance · Verification techniques



This work has been partially sponsored by the Austrian Ministry for Transport, Innovation and Technology (IKT der Zukunft, Project SALSA) and the Munich Center for Internet Research (MCIR). The author B. Gallina is financially supported by the ECSEL JU project AMASS (No. 692474).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Carmen Cârlan (1)
  • Barbara Gallina (2)
  • Severin Kacianka (3)
  • Ruth Breu (4)

  1. fortiss GmbH, Munich, Germany
  2. Mälardalen University, Västerås, Sweden
  3. Technische Universität München, Garching, Germany
  4. Institut für Informatik, Innsbruck, Austria
