A Diagnosis Framework for Critical Systems Verification (Short Paper)

  • Vincent Leildé
  • Vincent Ribaud
  • Ciprian Teodorov
  • Philippe Dhaussy
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10469)

Abstract

In critical systems design, verification tasks play a crucial role. If abnormalities are detected, a diagnostic process must be started to find and understand their root causes before corrective actions are applied. Detection and diagnosis are notions that overlap in common speech: detection means identifying something as unusual, whereas diagnosis means investigating its root cause. The meaning of diagnosis is itself ambiguous, because it denotes either an activity (an investigation) or an output result (the nature or type of a problem). This paper proposes an organizational framework for structuring diagnoses around three principles: propositional data (including detections) are the inputs of the diagnostic system; activities are made of methods and techniques; and associations specialize the relationships between the two preceding categories.
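To make the three-category structure concrete, the following is a minimal sketch in Python of how the framework's categories and their relationships could be modeled. All class names, attributes, and example values here are hypothetical illustrations chosen for this sketch; they are not definitions taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of the three categories described in the abstract:
# propositional data, activities, and the associations that specialize
# the relationships between them. Names are hypothetical.

@dataclass
class PropositionalData:
    """An input of the diagnostic system, e.g. a detection event."""
    description: str

@dataclass
class Activity:
    """A diagnostic activity, composed of methods and techniques."""
    name: str
    methods: List[str] = field(default_factory=list)
    techniques: List[str] = field(default_factory=list)

@dataclass
class Association:
    """Specializes a relationship between data and an activity."""
    data: PropositionalData
    activity: Activity
    kind: str  # e.g. "input-of", "produced-by"

# Example: a detected abnormality feeding a root-cause analysis activity.
detection = PropositionalData("deadlock detected by model checking")
analysis = Activity("root-cause analysis",
                    methods=["counterexample inspection"],
                    techniques=["trace exploration"])
link = Association(detection, analysis, kind="input-of")
```

Under this reading, a detection is not itself a diagnosis; it becomes diagnostic input only through an association that ties it to an investigating activity, which matches the paper's separation of the two notions.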

Keywords

Diagnosis · Verification · Critical systems · Framework

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Vincent Leildé (1)
  • Vincent Ribaud (2)
  • Ciprian Teodorov (1)
  • Philippe Dhaussy (1)

  1. Lab-STICC, Team MOCS, ENSTA Bretagne, Brest, France
  2. Lab-STICC, Team MOCS, Université de Bretagne Occidentale, Brest, France