Abstract
Risk and resilience are important paradigms for analyzing and guiding decisions about uncertain threats. Resilience has sometimes been favored for threats that are unknown, unquantifiable, systemic, and unlikely/catastrophic. This paper assesses the suitability of each paradigm for such threats and finds them comparably suitable. Threats are rarely completely unknown or unquantifiable; the limited information typically available enables the use of both paradigms. Either paradigm can in practice mishandle systemic or unlikely/catastrophic threats, but this reflects inadequate implementation of the paradigms, not inadequacy of the paradigms themselves. Three examples are described: (a) Venice during the Black Death plague, (b) artificial intelligence (AI), and (c) extraterrestrials. The Venice example suggests that each paradigm can be effective for certain unknown, unquantifiable, systemic, and unlikely/catastrophic threats. The AI and extraterrestrials examples suggest how increasing resilience may be less effective, and reducing threat probability more effective, for certain threats that are significantly unknown, unquantifiable, and unlikely/catastrophic.
Acknowledgments
I thank Tony Barrett, James H. Lambert, and three anonymous reviewers for helpful comments on earlier versions of this paper. Any remaining errors or other shortcomings are those of the author.
Cite this article
Baum, S.D. Risk and resilience for unknown, unquantifiable, systemic, and unlikely/catastrophic threats. Environ Syst Decis 35, 229–236 (2015). https://doi.org/10.1007/s10669-015-9551-8