Cognition, Technology & Work, Volume 15, Issue 3, pp 277–282

The little engine who could not: “rehabilitating” the individual in safety research

  • Sidney W. A. Dekker
  • James M. Nyce
  • Douglas J. Myers
Original Article

Abstract

Safety science is one of the enduring Enlightenment projects, founded on the belief that rationality can create a better, more controllable world. Accidents are seen not as meaningless coincidences but as failures of risk management that can be improved upon in the future. The tragedy of safety research is that it must simultaneously deny and affirm the primacy of human agency. As the field has gradually expanded away from the sharp end to see accidents as bureaucratic or administrative in origin, it keeps supplying linguistic and analytic resources that focus on individual shortcomings in leadership, communication or supervision. This paper concludes that individual human agency is useful to safety work, but not merely as an instrument of political or organizational expedience. It is useful because it deeply reflects and reinforces how failure and success are understood in the West. The explanatory power of this discourse is confirmed, or taken for granted, by safety researchers because it appears so ordinary, self-evident and commonsensical.

Keywords

Safety science · Risk management · Accident investigation · Human error · Human agency · Enlightenment


Copyright information

© Springer-Verlag London Limited 2012

Authors and Affiliations

  • Sidney W. A. Dekker (1)
  • James M. Nyce (2)
  • Douglas J. Myers (3)

  1. School of Humanities, Griffith University, Nathan, Australia
  2. Department of Anthropology, Ball State University, Muncie, USA
  3. Department of Community and Family Medicine, Duke University, Durham, USA
