
Quantitative Perspectives for Human Performance and Risk

  • Chapter
  • First Online:
  • In: Human-Automation Interaction

Abstract

Human performance is essential to the safe operation of systems in a wide range of industries, including nuclear, chemical, oil and gas, aviation, and aerospace. To understand risk and manage safety, the scenarios that lead to hazards for people, equipment, and the environment are evaluated in risk assessments. In probabilistic risk assessments (PRAs), probabilities are estimated for scenarios and their elements to support the prioritization of risk contributors and the selection of system modifications to enhance safety. Within a PRA, the Human Reliability Analysis (HRA) addresses the potential failure events related to human interactions with the technical system. The scope of the human-system interactions addressed in HRA includes human actions to operate and maintain the system and to respond to abnormal conditions and emergencies. This chapter presents the scientific basis for HRA and shows the relationship of HRA to the broader field of human factors and ergonomics. Additionally, this chapter provides an overview of HRA methods, the associated empirical data, and the structured analysis process used to identify the key human failure events (HFEs) relevant for safety, to characterize the conditions and factors that affect their occurrence, and to estimate their probabilities. The insights gained from the HRA process and its integration into PRA can be used to improve human performance as well as to identify modifications to hardware, equipment, and automation, with the overall aim of improving safety. Besides comprehensive HRA applications within PRA to evaluate the risk of facilities as a whole, HRA is also used in the design process to evaluate new or modified systems, functions, automation, and human-system interfaces. As a discipline, HRA applies risk and reliability techniques to evaluate the impact of human performance on the overall system. This includes human actions whose reliability has been improved by human factors, ergonomics, and human factors engineering, as well as actions within automated systems. The intent of this chapter is to provide the general reader with a high-level understanding of how the human element is treated in risk assessments, to provide human factors experts with an outline of how their science is transformed into probabilities and integrated into PRA, and to provide PRA practitioners with a synopsis of the science, empirical data, and judgment blended in the HRA process and its outputs. The chapter closes with a summary of recent developments to further advance HRA and overall conclusions on the state of practice.
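To make the quantification step concrete, the following is a minimal sketch of one common style of human error probability (HEP) estimation, in which a nominal error probability is adjusted by multipliers for performance shaping factors (PSFs) and the result enters a PRA scenario as the failure probability of the corresponding HFE. The nominal value, factor names, multipliers, and scenario numbers are illustrative assumptions in the spirit of multiplier-based methods such as SPAR-H; they are not taken from this chapter or from any particular method.

```python
"""Illustrative sketch only: a generic PSF-multiplier approach to estimating a
human error probability (HEP). All nominal values, PSF names, and multipliers
are assumed for the example, not taken from this chapter or any specific method."""

# Assumed nominal HEP for a post-initiator diagnosis-and-execution action
NOMINAL_HEP = 1.0e-3

# Assumed PSF multipliers for the analyzed condition
# (>1 degrades performance, <1 credits favorable conditions)
PSF_MULTIPLIERS = {
    "available_time": 10.0,          # barely adequate time to act
    "stress": 2.0,                   # high stress during the emergency
    "human_system_interface": 1.0,   # nominal indications and controls
    "procedures": 0.5,               # well-matched, symptom-based procedure
}

def estimate_hep(nominal: float, multipliers: dict[str, float]) -> float:
    """Adjust the nominal HEP by the product of the PSF multipliers, capped at 1.0."""
    hep = nominal
    for m in multipliers.values():
        hep *= m
    return min(hep, 1.0)

if __name__ == "__main__":
    hep = estimate_hep(NOMINAL_HEP, PSF_MULTIPLIERS)

    # The HEP enters the PRA as the failure probability of the HFE, e.g. in a
    # cut set where the operator action backs up a failed automatic system.
    initiating_event_frequency = 1.0e-2      # assumed, per year
    automatic_system_failure_prob = 1.0e-3   # assumed
    scenario_frequency = initiating_event_frequency * automatic_system_failure_prob * hep

    print(f"Estimated HEP: {hep:.2e}")
    print(f"Scenario frequency contribution: {scenario_frequency:.2e} per year")
```

In practice, HRA methods differ in the factors they consider, the multipliers or anchor values they use, and how dependence between multiple actions in the same scenario is treated; the chapter surveys these methods and the empirical data behind them.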


Notes

  1. Some methods use the term performance shaping factor (PSF), while other methods use terms such as performance influencing factors (PIFs) or error-producing conditions (EPCs). These terms are used interchangeably in this document.

  2. For the purpose of making a distinction, human factors processes (i.e., theory, principles, data, and methods) that do not incorporate HRA can be referred to as “classical human factors,” and human factors processes that do incorporate HRA can be referred to as “risk-informed human factors.”

  3. Other complex technologies include aircraft, spacecraft, railways, military systems, etc. The term “facilities” is used throughout this chapter to refer to all of these.

  4. In the rest of this section, this is referred to as the “human factors configuration” or simply “configuration.”

  5. A hard switch refers to a mechanical device that an operator manipulates by causing a physical change of position while in contact with the device. A soft switch refers to a graphical object on a touch screen that performs a similar function without a physical change of position.

  6. While it is clear from the discussion that HRA practitioners are a type of human factors engineer, in this chapter we use the term “human factors engineer” to mean practitioners of classical human factors and identify HRA practitioners specifically when that is what is meant.

  7. By “semi-quantitative” application, we mean that frequencies or probabilities are calculated or estimated for the failures identified in these deductive processes such that a risk ranking can be determined for the human failures of interest (a minimal illustration follows these notes).
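As an illustration of the semi-quantitative use of HRA results described in note 7, the sketch below ranks a few hypothetical human failure events by a point estimate of their contribution to scenario frequency. All event names, probabilities, and frequencies are invented for this example.

```python
"""Illustrative sketch of the semi-quantitative risk ranking mentioned in note 7:
hypothetical human failure events (HFEs) are ranked by their estimated
contribution to scenario frequency. All names and numbers are invented."""

from dataclasses import dataclass

@dataclass
class HumanFailureEvent:
    name: str
    hep: float                  # estimated human error probability
    scenario_frequency: float   # frequency (per year) of the scenario demanding the action

    @property
    def risk_contribution(self) -> float:
        # Simple point estimate: scenario frequency weighted by the HEP
        return self.scenario_frequency * self.hep

# Hypothetical HFEs identified in a deductive analysis (e.g., fault/event trees)
hfes = [
    HumanFailureEvent("Fail to start backup cooling manually", hep=5e-2, scenario_frequency=1e-3),
    HumanFailureEvent("Miscalibrate level transmitter (pre-initiator)", hep=1e-3, scenario_frequency=1e-2),
    HumanFailureEvent("Fail to isolate leak within 30 minutes", hep=1e-1, scenario_frequency=1e-5),
]

# Rank the human failures of interest by their estimated risk contribution
for hfe in sorted(hfes, key=lambda h: h.risk_contribution, reverse=True):
    print(f"{hfe.name}: {hfe.risk_contribution:.1e} per year")
```

In a full PRA, such rankings would typically rely on importance measures computed from the integrated model rather than on standalone products, but the principle of ordering human failures by their risk significance is the same.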


Author information

Corresponding author

Correspondence to Huafei Liao.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Liao, H. et al. (2023). Quantitative Perspectives for Human Performance and Risk. In: Duffy, V.G., Lehto, M., Yih, Y., Proctor, R.W. (eds) Human-Automation Interaction. Automation, Collaboration, & E-Services, vol 10. Springer, Cham. https://doi.org/10.1007/978-3-031-10780-1_13

