Abstract
Discussions about the appropriate way of assessing and managing new or emerging technologies, such as nanomaterials, expose the problematic relationship between scientific knowledge production and regulatory decision-making. On the one hand, there is a strong demand for scientific expertise to support decisions, especially for analyzing risks and hazards when uncertainties are prevalent and society's stakes are high. On the other hand, science is criticized for its authoritative claim to objectivity and for keeping the inherent uncertainty, ambiguity, and selectivity of scientific observation latent. Requests for more transparency in science can expose, to risk managers and the public, the indeterminacy of knowledge production processes. This has consequences for the standing of scientific knowledge in decision-making, because it increases uncertainty on both sides of the divide between science and decisions: scientists lose confidence in the scientifically tested knowledge they pass on, and risk managers lose confidence in the decisions they base on this knowledge. Nonetheless, the concept of "probabilistic risk assessment" remains an important heuristic for dealing with potential future events. This paper addresses the function of scientific risk assessment in organized risk management. Its main argument is that knowledge alone no longer functions as a mechanism for absorbing uncertainty. Accordingly, the interaction between science and decisions must enable a temporarily stable commitment to manage new threats such as products and applications emerging from the field of nanoscience and nanomaterials.
Notes
Luhmann used this expression in a similar fashion with regard to the mass media: “Whatever we know about our society, or indeed about the world in which we live, we know through the mass media (…) we know so much about the mass media that we are not able to trust these sources” [2, p. 1].
The concept of “truth” refers to the theory of “symbolically generalized communication media,” elaborated by Luhmann [12]. This concept cannot be laid out in detail in this paper, so the discussion is limited to the distinct modes of attribution. The specific function, differentiation, and self-validation of the symbolically generalized communication media of truth is discussed in “Die Wissenschaft der Gesellschaft” [The Science of Society] [13]. Unfortunately, the book has not been translated yet.
One of the most prominent attempts was conducted by Chauncey Starr [21].
Theories on risk, consequently, develop concepts or analyze heuristics which enable the reduction of complexity of future events, i.e., the limitation of expectable events. Considering different disciplines and fields of application, we can identify various approaches to deal with the problem of risk: probabilistic measurements, deterministic methods, and psycho-metric research on individual risk perception. For an overview, see [38].
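To make the first of these approaches concrete, the following is a minimal sketch, not taken from this paper or from any cited source, of how probabilistic risk measurement is commonly operationalized: as the probability-weighted sum of hypothetical damage scenarios. All scenario names and figures are invented for illustration.

```python
# Minimal sketch of a probabilistic risk measurement: risk as expected harm,
# i.e. the probability-weighted sum over hypothetical damage scenarios.
# Scenario names, probabilities, and damage figures are invented placeholders.

scenarios = [
    # (description, annual probability of occurrence, damage in arbitrary cost units)
    ("minor release of nanoparticles", 1e-2, 1_000),
    ("worker exposure above a threshold", 1e-3, 50_000),
    ("large-scale environmental release", 1e-5, 5_000_000),
]

# Expected annual harm: sum of probability x consequence over all scenarios.
expected_harm = sum(p * damage for _, p, damage in scenarios)
print(f"Expected annual harm: {expected_harm:.1f} cost units")
```

Such a figure limits the range of expectable events to a single number, which is precisely the reduction of complexity the note describes; deterministic and psychometric approaches address the same problem by other means.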
See Hansen and Baun [42] in this special section.
Pielke’s statement [35, p. 64] that good decisions reduce uncertainty is tautological: The condition for certainty is the absence of unknown (and possibly negative) consequences; yet such absence would simultaneously relieve one of the necessity to make a precarious decision in the first place.
The mark consists of a vertical line that separates two sides and a horizontal line that points to one side and not the other [48].
One example is the IceCube Neutrino Observatory, where neutrino research is conducted with a particle detector buried in the Antarctic ice. Statements about findings are communicated in statistical terms. The crucial distinction in the cited research is between the observation of ‘extraterrestrial’ events and events of atmospheric origin [55].
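As a hedged illustration of what "communicated in statistical terms" can mean in such a setting, the sketch below shows the generic form of a counting-experiment significance statement: the probability that the expected atmospheric background alone would produce at least the observed number of events. It does not reproduce the IceCube Collaboration's actual analysis; all numbers are arbitrary placeholders.

```python
# Generic significance calculation for a counting experiment (illustration only;
# not the IceCube Collaboration's actual analysis). Numbers are placeholders.
from scipy.stats import norm, poisson

expected_background = 10.0  # hypothetical expected events of atmospheric origin
observed_events = 25        # hypothetical observed events

# p-value: probability that background alone yields at least the observed count.
p_value = poisson.sf(observed_events - 1, expected_background)

# The same p-value expressed as a one-sided Gaussian significance ("n sigma").
significance = norm.isf(p_value)

print(f"p-value = {p_value:.2e}, roughly {significance:.1f} sigma")
```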
Thus, the observation creates the object observed. Comparable statements are also made in physics (indeterminacy principle) and mathematics (observer theory [56]).
As a result of the scientification of politics, the latter cannot be conducted on ideological grounds alone. Scientific advice serves the purpose of displaying actionability within politics, but with reference to reasoning from outside of politics [60, p. 258].
The precautionary principle is a striking example of the problems of operationalizing hypothetical knowledge for legal action: “The [European] Court of Justice and Court of First Instance, as well as the EFTA Courts, reply to this [speculative health risks] is that a preventative measure cannot properly be based on a purely hypothetical consideration of the risk, founded on mere conjecture that has not been scientifically verified. It follows that there must exist a threshold of scientific plausibility” [67].
Examples of other social systems are science, the economy, and law.
A topic which cannot be discussed in depth in this paper.
This is true in both qualitative and quantitative respects: regulation occurs earlier and more often because of the precautionary principle, and it must proceed on the basis of fragile knowledge: “where there is uncertainty as to the existence or extent of risks to human health, protective measures may be taken without having to wait until the reality and seriousness of those risks become fully apparent” [67, p. 142].
For a thorough discussion, see Jahnel [11] in this special section.
Our translation of “Unsicherheitsabsorption ist ein Entscheidungsprozess.”
References
Barber B (1987) Trust in science. Minerva 25:123–134. doi:10.1007/BF01096860
Luhmann N (2000) The reality of the mass media. Stanford University Press, Stanford
NRC (2009) Science and decisions. Advancing risk assessment. National Academies Press, Washington, DC
Jahnel J (2015) Addressing the challenges to the risk assessment of nanomaterials. In: Dolez PI (ed) Nanoengineering: Global approaches to health and safety issues. Elsevier, Amsterdam, Boston and Heidelberg, pp 485–521
Miller G, Wickson F (2015) Risk analysis of nanomaterials: exposing nanotechnology’s naked emperor. Rev Policy Res 32:485–512. doi:10.1111/ropr.12129
Wynne B (2001) Creating public alienation: expert cultures of risk and ethics on GMOs. Sci Cult 10:445–481
Jasanoff S (2003) Technologies of humility: citizen participation in governing science. Minerva 41:223–244
SCHER, SCENIHR, SCCS (2013) Making risk assessment more relevant for risk management. Scientific Committee on Consumer Safety; Scientific Committee on Health and Environmental Risks; Scientific Committee on Emerging and Newly Identified Health Risks. European Commission, Brussels
Stirling A (2008) “Opening up” and “closing down” power, participation, and pluralism in the social appraisal of technology. Sci Technol Hum Values 33:262–294. doi:10.1177/0162243907311265
IRGC (2006) White paper on nanotechnology risk governance. International Risk Governance Council, Geneva
Jahnel J (2015) Conceptual questions and challenges associated with the traditional risk assessment paradigm for nanomaterials. NanoEthics 9(3). doi:10.1007/s11569-015-0235-0
Luhmann N (2012) Theory of society - volume 1. Stanford University Press, Stanford
Luhmann N (1994) Die Wissenschaft der Gesellschaft, 2nd edn. Suhrkamp, Frankfurt am Main
Fleischer T, Jahnel J, Seitz S (2012) NanoSafety. Risk governance of manufactured nanoparticles. STOA, Brussels
Groves C (2009) Nanotechnology, contingency and finitude. NanoEthics 3:1–16. doi:10.1007/s11569-009-0057-z
Rocks S, Pollard S, Dorey R et al (2008) Comparison of risk assessment approaches for manufactured nanomaterials. Defra, London
EFSA Scientific Committee (2011) Guidance on the risk assessment of the application of nanoscience and nanotechnologies in the food and feed chain. EFSA J 9:1–36. doi:10.2903/j.efsa.2011.2140
SCENIHR (2007) Opinion on the appropriateness of the risk assessment methodology in accordance with the technical guidance documents for new and existing substances for assessing the risk of nanomaterials. European Commission, Brussels
Robinson LA, Levy DI (2011) The [r]evolving relationship between risk assessment and risk management. Risk Anal 31:1334–1344
OECD (2003) Descriptions of selected key generic terms used in chemical hazard/risk assessment. Organisation for Economic Co-operation and Development (OECD), Paris
Starr C (1969) Social benefit versus technological risk. What is our society willing to pay for safety? Science 165:1232–1238
Kahneman D, Tversky A (1982) Subjective probability: a judgment of representativeness. In: Kahneman D, Slovic P, Tversky A (eds) Judgment under uncertainty: Heuristics and biases. Cambridge University Press, Cambridge, pp 32–47
Jungermann H, Slovic P (1993) Die Psychologie der Kognition und Evaluation von Risiko. In: Bechmann G (ed) Risiko und Gesellschaft. Grundlagen und Ergebnisse interdisziplinärer Risikoforschung. Westdt Verl, Opladen, pp 167–207
Gigerenzer G, Gaissmaier W (2011) Heuristic decision making. Annu Rev Psychol 62:451–482. doi:10.1146/annurev-psych-120709-145346
Rowe WD (1977) An anatomy of risk. Wiley, New York
Renn O (2008) Concepts of risk: an interdisciplinary review - part 2: integrative approaches. GAIA 17:196–204
Jasanoff S (1998) The political science of risk perception. Reliab Eng Syst Saf 59:91–99. doi:10.1016/S0951-8320(97)00129-4
Zwick MM, Renn O (2008) Risikokonzepte jenseits von Eintrittswahrscheinlichkeit und Schadenserwartung. In: Felgentreff C, Glade T (eds) Naturrisiken und Sozialkatastrophen. Spektrum, Berlin, pp 77–97
Merz B, Emmermann R (2006) Zum Umgang mit Naturgefahren in Deutschland: Vom Reagieren zum Risikomanagement. GAIA 15:265–274
Felgentreff C, Glade T (2008) Naturrisiken und Sozialkatastrophen. Spektrum, Berlin
Wynne B (1995) Technology assessment and reflexive learning: Observations from the risk field. In: Rip A, Misa TJ, Schot J (eds) Managing technology in society: The approach of constructive technology assessment. Pinter Publishers, London, pp 19–36
Wynne B (1996) May the sheep safely graze? - A reflexive view of the expert-lay knowledge divide. In: Lash S, Szerszynski B, Wynne B (eds) Risk, environment and modernity: Towards a new ecology. SAGE, London, Thousand Oaks and New Delhi, pp 44–83
Wynne B (2005) Risk as globalizing “democratic” discourse? Framing subjects and citizens. In: Leach M, Scoones I, Wynne B (eds) Science and citizens: Globalization and the challenge of engagement. Zed Books, London, pp 66–82
Luhmann N (1995) Social systems. Stanford University Press, Stanford
Pielke RA (2007) The honest broker. Making sense of science in policy and politics. Cambridge University Press, Cambridge
Japp KP (1992) Selbstverstärkungseffekte riskanter Entscheidungen. Zur Unterscheidung von Rationalität und Risiko. ZfS 21:31–48
Luhmann N (2005) Risk - a sociological theory. Aldine Transaction, New Brunswick
Büscher C, Mascareño A (2014) Mechanisms of risk production in modern cities. Nat Cult 9:66–86. doi:10.3167/nc.2014.090104
Knight FH (1921) Risk, uncertainty and profit. Houghton Mifflin, http://www.econlib.org/library/Knight/knRUP.html
March JG (1994) A primer on decision making: how decisions happen. Maxwell Macmillan International, New York
Luhmann N (1993) Die Paradoxie der Form. In: Baecker D (ed) Kalkül der Form. Suhrkamp, Frankfurt am Main, pp 197–212
Hansen SF, Baun A (2015) DPSIR- and stakeholder analysis of the use of nanosilver. NanoEthics 9(2)
MacCrimmon KR, Wehrung DA, Stanbury WT (1986) Taking risks: the management of uncertainty. Free Press, New York
Elster J (1994) Rationality, emotions, and social norms. Synthese 98:21–49. doi:10.1007/BF01064024
Brunsson N (2000) The irrational organization: irrationality as a basis for organizational action and change, 2nd edn. Fagbokforlaget Vigmostad Bjørke, Bergen
Brunsson N (2007) The consequences of decision-making. Oxford University Press, Oxford
March JG, Simon HA (1993) Organizations, 2nd edn. Blackwell Publishers, Cambridge
Spencer-Brown G (1997) Laws of Form. Gesetze der Form. Bohmeier, Lübeck
Merton RK (1973) The normative structure of science. In: Storer NW (ed) The sociology of science: Theoretical and empirical investigations. University of Chicago Press, Chicago, pp 267–278
Luhmann N (2006) System as difference. Organization 13:37–57. doi:10.1177/1350508406059638
Malle BF (1999) How people explain behavior: a new theoretical framework. Personal Soc Psychol Rev 3:23–48. doi:10.1207/s15327957pspr0301_2
Parsons T, Platt GM (1973) The American university. Harvard University Press, Cambridge
Pinch T (1985) Towards an analysis of scientific observation: the externality and evidential significance of observational reports in physics. Soc Stud Sci 15:3–36. doi:10.1177/030631285015001001
Douglas H (2004) The irreducible complexity of objectivity. Synthese 138:453–473. doi:10.1023/B:SYNT.0000016451.18182.91
IceCube Collaboration (2013) Evidence for high-energy extraterrestrial neutrinos at the IceCube detector. Science 342:1–7. doi:10.1126/science.1242856
Spencer-Brown G (1996) Wahrscheinlichkeit und Wissenschaft. Carl-Auer-Systeme, Heidelberg
Weiss C (2003) Expressing scientific uncertainty. Law Probab Risk 2:25–46. doi:10.1093/lpr/2.1.25
von Weizsäcker CF (1985) Aufbau der Physik. Carl Hanser Verlag, München
Lindley DV (2000) The philosophy of statistics. J R Stat Soc Ser D (The Statistician) 49:293
Kusche I (2008) Politikberatung und die Herstellung von Entscheidungssicherheit im politischen System. VS Verl. für Sozialwiss, Wiesbaden
Klaine SJ, Koelmans AA, Horne N et al (2012) Paradigms to assess the environmental impact of manufactured nanomaterials. Environ Toxicol Chem 31:3–14. doi:10.1002/etc.733
JRC (2011) REACH implementation project: substance identification of nano materials (RIP - oN 1) - Advisory report. European Commission Joint Research Centre Institute for Health and Consumer Protection
Renn O, Grobe A (2010) Risk governance in the field of nanotechnologies: Core challenges of an integrative approach. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 484–507
Luhmann N (1996) On the scientific context of the concept of communication. Soc Sci Inf 35:257–267. doi:10.1177/053901896035002005
Meili C, Widmer M (2010) Voluntary measures in nanotechnology risk governance: The difficulty of holding the wolf by the ears. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 446–461
Krug HF (2014) Nanosafety research—Are we on the right track? Angew Chem Int Ed. doi:10.1002/anie.201403367
de Sadeleer N (2006) The precautionary principle in EC health and environmental law. Eur Law J 12:139–172. doi:10.1111/j.1468-0386.2006.00313.x
Widmer M, Meili C (2010) Approaching the nanoregulation problem in chemical legislation in the EU and US. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 239–267
Japp KP (2000) Distinguishing non-knowledge. Can J Sociol 25:225–238
Hodge GA, Bowman DM, Maynard AD (2010) Introduction: The regulatory challenges for nanotechnologies. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 3–24
Hansen SF (2013) The European Union’s chemical legislation needs revision. Nat Nanotechnol 8:305–306. doi:10.1038/nnano.2013.72
Weick KE, Sutcliffe KM, Obstfeld D (2005) Organizing and the process of sensemaking. Organ Sci 16:409–421. doi:10.1287/orsc.1050.0133
Luhmann N (2010) Politische Soziologie. Suhrkamp, Berlin
Millstone E (2010) The evolution of risk assessment paradigms: in theory and in practice. Sussex, England
Codex Alimentarius Commission (2007) Codex alimentarius. World Health Organization, Food and Agriculture Organization of the United Nations, Rome
Luhmann N (2000) Organisation und Entscheidung. Westdt Verl, Opladen
Acknowledgments
I would like to express my gratitude to the editor and the reviewer for their helpful comments, which improved this paper. I would also like to thank my colleagues at the Institute for Technology Assessment and Systems Analysis for their contributions in countless discussions; Jutta Jahnel, in particular, helped me sharpen the arguments. For her patience and support with linguistic and stylistic issues, I would like to thank Mira Klemm. Any remaining issues are my responsibility.
Ethics declarations
Conflict of Interest
The author declares that he has no conflict of interest.
Keywords
- Risk assessment
- Risk management
- Nanotechnology
- Decision-making
- Uncertainty absorption
- Attribution theory