
Requirements Engineering

Volume 23, Issue 2, pp 291–329

Hazard Relation Diagrams: a diagrammatic representation to increase validation objectivity of requirements-based hazard mitigations

  • Bastian Tenbergen
  • Thorsten Weyer
  • Klaus Pohl
Original Article

Abstract

When developing safety-critical embedded systems, it is necessary to ensure that the system under development poses no risk of harm to human users or external systems during operation. To achieve this, potential hazards are identified and potential mitigations for those hazards are documented in requirements. During requirements validation, the stakeholders assess whether the documented hazard-mitigating requirements can avoid the identified hazards. Requirements validation is highly subjective. Among other factors, validation depends on the stakeholders’ understanding of the involved processes, their familiarity with the system under development, and the information available to them. As a consequence, there is a risk that stakeholders judge the adequacy of hazard-mitigating requirements based on their individual opinions about the hazards, rather than on the documented information about the system’s hazards. To improve the validation of hazard-mitigating requirements, we recently proposed a diagrammatic representation called Hazard Relation Diagrams (Tenbergen B, Weyer T, Pohl K, Supporting the validation of adequacy in requirements-based hazard mitigations. In: Requirements engineering: foundation for software quality. LNCS, vol 9013. Springer, pp 17–32, 2015). In this paper, we extend the ontology of Hazard Relation Diagrams, present their notation, and define well-formedness rules. We elaborate on the application of Hazard Relation Diagrams to visualize complex relationships between hazards and mitigations and present an automated approach to generate Hazard Relation Diagrams. Finally, we report on our empirical evaluations of the impact of Hazard Relation Diagrams on review objectivity, effectiveness, efficiency, and reviewers’ subjective confidence.

Keywords

Safety requirements · Hazards · Hazard-mitigating requirements · Safety assessment · Validation · Reviews · Inspections · Mitigation · Adequacy · Modeling · Safety-critical embedded systems · Model-based engineering · Hazard Relation Diagrams

Acknowledgements

This research was partly funded by the German Federal Ministry of Education and Research (BMBF) under Grant number 01IS12005C (project SPES XT). We thank Arnaud Boyer (Airbus Defence and Space) for his consultation regarding FHA. We thank Dr. Frank Houdek (Daimler AG) as well as Peter Heidl and Jens Höfflinger (Robert Bosch GmbH) for their consultation on the adaptive cruise control system. We thank Dr. Kai Petersen of Blekinge Institute of Technology as well as Marian Daun and André Heuer (University of Duisburg-Essen) for their feedback on the study design. We also thank all our participants in the experiments.

References

  1. Leveson NG (2011) Engineering a safer world: systems thinking applied to safety. Engineering systems. MIT Press, Cambridge
  2. Leveson NG (1995) Safeware: system safety and computers. Addison-Wesley, Reading, MA
  3. SAE International (1996) ARP4761, Guidelines and methods for conducting the safety assessment process on civil airborne systems and equipment. http://standards.sae.org/arp4761/. Accessed 7 Jan 2016
  4. International Organization for Standardization (2011) ISO 26262, Road vehicles—Functional safety. http://www.iso.org/iso/catalogue_detail?csnumber=43464. Accessed 7 Jan 2016
  5. Ericson CA (2005) Hazard analysis techniques for system safety. Wiley, Hoboken
  6. Bishop P, Bloomfield R, Guerra S (2004) The future of goal-based assurance cases. In: Proceedings of the workshop on assurance cases, pp 390–395
  7. Wilson SP, Kelly TP, McDermid JA (1997) Safety case development: current practice, future prospects. In: Shaw R (ed) Safety and reliability of software based systems. Springer, London, pp 135–156
  8. IEEE Standards Board (1990) IEEE Std. 610.12: IEEE standard glossary of software engineering terminology
  9. Leveson N (2011) The use of safety cases in certification and regulation. J Syst Saf 47(6). http://goo.gl/j9NW5Y. Accessed 13 July 2016
  10. Firesmith D (2004) Engineering safety requirements, safety constraints, and safety-critical requirements. J Object Technol 3(3):27–42. doi:10.5381/jot.2004.3.3.c3
  11. Hatcliff J, Wassyng A, Kelly T et al (2014) Certifiably safe software-dependent systems: challenges and directions. In: Proceedings of the future of software engineering, pp 182–200
  12. Glinz M (2000) Improving the quality of requirements with scenarios. In: Proceedings of the 2nd world congress on software quality, pp 55–60
  13. Flynn DJ, Warhurst R (1994) An empirical study of the validation process within requirements determination. Inf Syst J 4(3):185–212. doi:10.1111/j.1365-2575.1994.tb00051.x
  14. Lisagor O, Sun L, Kelly T (2010) The illusion of method: challenges of model-based safety assessment. In: Proceedings of the 28th international system safety conference (ISSC)
  15. Sun L (2012) Establishing confidence in safety assessment evidence. Dissertation, University of York
  16. Gacitua R, Ma L, Nuseibeh B, Piwek P, de Roeck AN, Rouncefield M, Sawyer P, Willis A, Yang H (2009) Making tacit requirements explicit. In: Proceedings of the 2nd international workshop on managing requirements knowledge (MARK), pp 40–44
  17. Glinz M, Fricker SA (2015) On shared understanding in software engineering: an essay. Comput Sci Res Dev 30(3–4):363–376. doi:10.1007/s00450-014-0256-x
  18. Mao J, Chen L (2012) Runtime monitoring for cyber-physical systems: a case study of cooperative adaptive cruise control. In: Proceedings of the 2nd international conference on intelligent system design and engineering application, pp 509–515
  19. Caramihai SI, Dumitrache I (2013) Urban traffic monitoring and control as a cyber-physical system approach. In: Dumitrache L (ed) Advances in intelligent control systems and computer science. Springer, Berlin, pp 355–366
  20. Lempia DL, Miller S (2009) Requirements engineering management findings report. Technical report DOT/FAA/AR-08/34, Federal Aviation Administration
  21. Tenbergen B, Weyer T, Pohl K (2015) Supporting the validation of adequacy in requirements-based hazard mitigations. In: Requirements engineering: foundation for software quality. LNCS, vol 9013. Springer, pp 17–32
  22. Heimdahl MP (2007) Safety and software intensive systems: challenges old and new. In: Future of software engineering, pp 137–152
  23. Stoneburner G (2006) Toward a unified security-safety model. Computer 39(8):96–97. doi:10.1109/MC.2006.283
  24. Kelly T (2007) Reviewing assurance arguments—a step-by-step approach. In: Workshop on assurance cases for security—the metrics challenge, dependable systems and networks (DSN)
  25. Johnson CW, Holloway CM (2006) Questioning the role of requirements engineering in the causes of safety-critical software failures. In: IET international conference on system safety, pp 352–361
  26. Lempia DL, Miller S (2009) Requirements engineering management handbook. Technical report DOT/FAA/AR-08/32, Federal Aviation Administration
  27. Chung L, Nixon BA, Yu E et al (2000) Non-functional requirements in software engineering. International series in software engineering, vol 5. Springer, Berlin
  28. Moody DL (2009) The “physics” of notations: toward a scientific basis for constructing visual notations in software engineering. IEEE Trans Softw Eng 35(6):756–779
  29. Störrle H (2004) Semantics of control-flow in UML 2.0 activities. In: Proceedings of the IEEE symposium on visual languages and human centric computing, pp 235–242
  30. Object Management Group (2015) OMG Unified Modeling Language, version 2.5. OMG document number formal/2015-03-01. http://goo.gl/7cQyPv. Accessed 13 July 2016
  31. van Lamsweerde A (2009) Requirements engineering: from system goals to UML models to software specifications. Wiley, Chichester
  32. Yu E (1997) Towards modelling and reasoning support for early-phase requirements engineering. In: Proceedings of the international symposium on requirements engineering (RE), pp 226–235
  33. Giorgini P, Mylopoulos J, Sebastiani R (2005) Goal-oriented requirements analysis and reasoning in the Tropos methodology. Eng Appl Artif Intell 18(2):159–171. doi:10.1016/j.engappai.2004.11.017
  34. Kelly T, Weaver R (2004) The goal structuring notation—a safety argument notation. In: Proceedings of the workshop on assurance cases of dependable systems and networks
  35. Heim I, Kratzer A (1998) Semantics in generative grammar. Wiley, Chichester
  36. Finkelstein A, Kramer J, Nuseibeh B, Finkelstein L, Goedicke M (1992) Viewpoints: a framework for integrating multiple perspectives in system development. Int J Softw Eng Knowl Eng 2(1):31–58
  37. Conway M (1968) How do committees invent? Datamation 14(4):28–31
  38. Reif K (2010) Fahrstabilisierungssysteme und Fahrerassistenzsysteme. Vieweg+Teubner, Wiesbaden
  39. Object Management Group (2011) QVT: Meta Object Facility (MOF) 2.0 Query/View/Transformation, v1.1
  40. QVT Operational Eclipse plugin, v3.5.0. https://goo.gl/SglK1F. Accessed 7 Jan 2016
  41. Eclipse Modeling Tools, Luna package distribution. https://goo.gl/qo9Sf5. Accessed 7 Jan 2016
  42. Jedlitschka A, Ciolkowski M, Pfahl D (2008) Reporting experiments in software engineering. In: Shull F, Singer J, Sjøberg DIK (eds) Guide to advanced empirical software engineering. Springer, London, pp 201–228
  43. Wohlin C, Runeson P, Höst M et al (2012) Experimentation in software engineering. Springer, Berlin
  44. Tabachnick BG, Fidell LS (2010) Using multivariate statistics, 5th edn. Pearson/Allyn and Bacon, Boston
  45. SoSci Survey. https://www.soscisurvey.de. Accessed 7 Jan 2016
  46. Venkatesh V, Bala H (2008) Technology acceptance model 3 and a research agenda on interventions. Decis Sci 39(2):273–315. doi:10.1111/j.1540-5915.2008.00192.x
  47. Goodhue DL (1998) Development and measurement validity of a task-technology fit instrument for user evaluations of information systems. Decis Sci 29(1):105–138. doi:10.1111/j.1540-5915.1998.tb01346.x
  48. van Solingen R, Berghout E (1999) The goal/question/metric method: a practical guide for quality improvement of software development. McGraw-Hill, London
  49. Osgood CE, Suci G, Tannenbaum P (1957) The measurement of meaning. University of Illinois Press, Urbana
  50. Verhagen T, van den Hooff B, Meents S (2015) Toward a better use of the semantic differential in IS research: an integrative framework of suggested action. J Assoc Inf Syst 16(2):108–143
  51. Corbin JM, Strauss AL (2008) Basics of qualitative research: techniques and procedures for developing grounded theory, 3rd edn. Sage, Los Angeles
  52. Cronbach LJ (1951) Coefficient alpha and the internal structure of tests. Psychometrika 16(3):297–334. doi:10.1007/BF02310555
  53. Student (1908) The probable error of a mean. Biometrika 6(1):1–25. doi:10.2307/2331554
  54. Cohen J (1992) A power primer. Psychol Bull 112(1):155–159. doi:10.1037/0033-2909.112.1.155
  55. David HA, Gunnink JL (1997) The paired t test under artificial pairing. Am Stat 51(1):9–12. doi:10.2307/2684684
  56. Carver J, Jaccheri L, Morasca S, Shull F (2003) Issues in using students in empirical studies in software engineering education. In: Proceedings of the 9th international software metrics symposium, pp 239–249
  57. Hart C, Mulhall P, Berry A, Loughran J, Gunstone R (2000) What is the purpose of this experiment? Or can students learn something from doing experiments? J Res Sci Teach 37(7):655–675. doi:10.1002/1098-2736(200009)37:7<655:AID-TEA3>3.0.CO;2-E
  58. Carver JC, Nagappan N, Page A (2008) The impact of educational background on the effectiveness of requirements inspections: an empirical study. IEEE Trans Softw Eng 34(6):800–812. doi:10.1109/TSE.2008.49
  59. Navarro E, Sanchez P, Letelier P, Pastor JA, Ramos I (2006) A goal-oriented approach for safety requirements specification. In: Proceedings of the 13th annual IEEE international symposium and workshop on engineering of computer-based systems, pp 319–326
  60. Allenby K, Kelly T (2001) Deriving safety requirements using scenarios. In: Proceedings of the 5th IEEE international symposium on requirements engineering (RE), pp 228–235
  61. Chen D, Johansson R, Lönn H, Papadopoulos Y, Sandberg A, Törner F, Törngren M (2008) Modelling support for design of safety-critical automotive embedded systems. In: Proceedings of the 27th international conference on computer safety, reliability and security, pp 72–85
  62. Guillerm R, Demmou H, Sadou N (2011) Combining FMECA and fault trees for declining safety requirements of complex systems. In: Soares CG (ed) Advances in safety, reliability and risk management. CRC Press, pp 1287–1293
  63. Hansen KM, Ravn AP, Stavridou V (1998) From safety analysis to software requirements. IEEE Trans Softw Eng 24(7):573–584. doi:10.1109/32.708570
  64. Tsuchiya T, Terada H, Kusumoto S, Kikuno T, Kim EM (1997) Derivation of safety requirements for safety analysis of object-oriented design documents. In: Proceedings of the 21st annual international computer software and applications conference, pp 252–255
  65. Troubitsyna E (2008) Elicitation and specification of safety requirements. In: Proceedings of the 3rd international conference on systems, pp 202–207
  66. Xu X, Bao X, Lu M, Chang W (2011) A study and application on airborne software safety requirements elicitation. In: Proceedings of the 9th international conference on reliability, maintainability and safety, pp 710–716
  67. van Lamsweerde A (2009) Reasoning about alternative requirements options. In: Borgida AT, Chaudhri VK, Giorgini P, Yu ES (eds) Conceptual modeling: foundations and applications. Springer, Heidelberg, pp 380–397
  68. Sindre G (2007) A look at misuse cases for safety concerns. In: Proceedings of the IFIP WG 8.1 conference, pp 252–266
  69. Raspotnig C, Opdahl A (2013) Comparing risk identification techniques for safety and security requirements. J Syst Softw 86(4):1124–1151. doi:10.1016/j.jss.2012.12.002
  70. Leveson NG (2004) A systems-theoretic approach to safety in software-intensive systems. IEEE Trans Depend Secur Comput 1(1):66–86. doi:10.1109/TDSC.2004.1
  71. Shull F, Basili V, Boehm B, Winsor Brown A, Costa P, Lindvall M, Port D, Rus I, Tesoriero R, Zelkowitz M (2002) What we have learned about fighting defects. In: Proceedings of the 8th international symposium on software metrics, pp 249–258
  72. Boehm B, Basili VR (2001) Software defect reduction top 10 list. Computer 34(1):135–137. doi:10.1109/2.962984
  73. Basili VR, Green S, Laitenberger O, Lanubile F, Shull F, Sorumgard S, Zelkowitz M (1996) The empirical investigation of perspective-based reading. Empir Softw Eng 1(2):133–164. doi:10.1007/BF00368702
  74. Shull F, Rus I, Basili V (2000) How perspective-based reading can improve requirements inspections. Computer 33(7):73–79. doi:10.1109/2.869376
  75. Li Q, Boehm B, Yang Y, Wang Q (2011) A value-based review process for prioritizing artifacts. In: Proceedings of the international conference on software and system process, pp 13–23
  76. Aurum A, Petersson H, Wohlin C (2002) State-of-the-art: software inspections after 25 years. Softw Test Verif Reliab 12(3):133–154. doi:10.1002/stvr.243
  77. Porter AA, Votta LG, Basili VR (1995) Comparing detection methods for software requirements inspections: a replicated experiment. IEEE Trans Softw Eng 21(6):563–575. doi:10.1109/32.391380
  78. Lee K, Boehm B (2005) Empirical results from an experiment on value-based review (VBR) processes. In: Proceedings of the international symposium on empirical software engineering, pp 3–12
  79. Cruickshank KJ, Michael JB, Shing MT (2009) A validation metrics framework for safety-critical software-intensive systems. In: Proceedings of the IEEE international conference on system of systems engineering, pp 1–8
  80. Michael JB, Shing MT, Cruickshank KJ, Redmond PJ (2010) Hazard analysis and validation metrics framework for system of systems software safety. IEEE Syst J 4(2):186–197. doi:10.1109/JSYST.2010.2050159
  81. Driskell SB, Murphy J, Michael JB, Shing MT (2010) Independent validation of software safety requirements for systems of systems. In: Proceedings of the 5th international conference on system of systems engineering, pp 1–6
  82. Belli F, Hollmann A, Nissanke N (2007) Modeling, analysis and testing of safety issues—an event-based approach and case study. In: Proceedings of the 26th international conference on computer safety, reliability and security, pp 276–282
  83. Bitsch F (2001) Safety patterns—the key to formal specification of safety requirements. In: Proceedings of the 20th international conference on computer safety, reliability and security, pp 176–189
  84. Bharadwaj R, Heitmeyer CL (1999) Model checking complete requirements specifications using abstraction. Autom Softw Eng 6(1):37–68. doi:10.1023/A:1008697817793
  85. Heitmeyer C, Kirby J, Labaw B et al (1998) Using abstraction and model checking to detect safety violations in requirements specifications. IEEE Trans Softw Eng 24(11):927–948. doi:10.1109/32.730543
  86. Zafar S, Dromey RG (2005) Integrating safety and security requirements into design of an embedded system. In: Proceedings of the 12th Asia-Pacific software engineering conference, pp 629–636
  87. Robertson S, Robertson J (2013) Mastering the requirements process: getting requirements right, 3rd edn. Addison-Wesley, Upper Saddle River

Copyright information

© Springer-Verlag London 2017

Authors and Affiliations

  • Bastian Tenbergen (1, 2)
  • Thorsten Weyer (2)
  • Klaus Pohl (2)

  1. Department of Computer Science, State University of New York at Oswego, Oswego, USA
  2. paluno – The Ruhr Institute for Software Technology, University of Duisburg-Essen, Essen, Germany
