Causes of Risk Information Concealment


Abstract

In what follows, we identify and synthesize the 30 main factors that led organizations and personnel, in the previously discussed cases, to conceal risks. This synthesis rests on the observation that people have acted in strikingly similar ways to conceal risks across different disaster situations, spanning a very broad range of industries and contexts.

There is not a crime, there is not a dodge, there is not a trick, there is not a swindle, there is not a vice which does not live by secrecy.

Joseph Pulitzer

That men do not learn very much from the lessons of history is the most important of all the lessons that history has to teach.

Aldous Huxley

Only a fool learns from his own mistakes. The wise man learns from the mistakes of others.

Otto von Bismarck


Copyright information

© 2016 Springer International Publishing Switzerland

Cite this chapter

Chernov, D., Sornette, D. (2016). Causes of Risk Information Concealment. In: Man-made Catastrophes and Risk Information Concealment. Springer, Cham. https://doi.org/10.1007/978-3-319-24301-6_3
