Risk Calculation as Experience and Action—Assessing and Managing the Risks and Opportunities of Nanomaterials

  • Original Paper
  • Published in NanoEthics

Abstract

Discussions about the appropriate way of assessing and managing new or emerging technologies—like nanomaterials—expose the problematic relationship between scientific knowledge production and regulatory decision-making. On one hand, there is a strong demand for scientific expertise to support decisions, especially by analyzing risks and hazards when uncertainties are prevalent and society’s stakes are high. On the other hand, science is criticized for its authoritative claim to objectivity and for keeping the inherent uncertainty, ambiguity, and selectivity of scientific observation latent. Requests for more transparency in science can reveal to risk managers and the public the indeterminacy inherent in knowledge production processes. This has consequences for the prevalence of scientific knowledge in decision-making, because it increases uncertainty on both sides of the divide between science and decisions: scientists lose confidence in the scientifically tested knowledge they pass on, and risk managers lose confidence in the decisions they base on this knowledge. Nonetheless, the concept of “probabilistic risk assessment” remains an important heuristic for dealing with potential future events. This paper addresses the function of scientific risk assessment in organized risk management. Its main argument is that knowledge alone no longer functions as a mechanism for absorbing uncertainty. Accordingly, the interaction between science and decision-making must enable a temporarily stable commitment to managing new threats, such as the products and applications emerging from the field of nanoscience and nanomaterials.

Notes

  1. Luhmann used this expression in a similar fashion with regard to the mass media: “Whatever we know about our society, or indeed about the world in which we live, we know through the mass media (…) we know so much about the mass media that we are not able to trust these sources” [2, p. 1].

  2. One example can be found in the NRC report [3]; another example (of an excessively complicated model) can be found in the International Risk Governance Council report [10]. For an overview, see Jahnel [11] in this special section.

  3. The concept of “truth” refers to the theory of “symbolically generalized communication media,” elaborated by Luhmann [12]. This concept cannot be laid out in detail in this paper, so the discussion is limited to the distinct modes of attribution. The specific function, differentiation, and self-validation of the symbolically generalized communication media of truth is discussed in “Die Wissenschaft der Gesellschaft” [The Science of Society] [13]. Unfortunately, the book has not been translated yet.

  4. Other experts concur in this evaluation [17, 18].

  5. One of the most prominent attempts was conducted by Chauncey Starr [21].

  6. See, for example, Zwick and Renn [28]. In line with this call for more information, the findings of risk perception research have been incorporated above all into natural-science-based hazard research [29], e.g., in many of the contributions to this anthology [30].

  7. Theories on risk, consequently, develop concepts or analyze heuristics that enable the reduction of the complexity of future events, i.e., the limitation of the range of expectable events. Across different disciplines and fields of application, we can identify various approaches to dealing with the problem of risk: probabilistic measurements, deterministic methods, and psychometric research on individual risk perception (for a minimal numerical sketch of the probabilistic approach, see the example following these notes). For an overview, see [38].

  8. See Hansen and Baun [42] in this special section.

  9. Pielke’s statement [35, p. 64] that good decisions reduce uncertainty is tautological: the condition for certainty is the absence of unknown (and possibly negative) consequences; yet such absence would simultaneously relieve one of the necessity to make a precarious decision in the first place.

  10. The mark consists of a vertical line that separates two sides and a horizontal line that points to one side and not the other [48].

  11. For example, the IceCube Neutrino Observatory, where neutrino research is conducted with a particle detector buried in Antarctic ice. Statements about findings are communicated in statistical terms (see the illustrative sketch following these notes). The crucial distinction in the cited research is between ‘extraterrestrial’ events and events of atmospheric origin [55].

  12. Thus, the observation creates the object observed. Comparable statements are also made in physics (indeterminacy principle) and mathematics (observer theory [56]).

  13. As a result of the scientification of politics, the latter cannot be conducted on ideological grounds alone. Scientific advice serves the purpose of displaying actionability within politics, but with reference to reasoning from outside of politics [60, p. 258].

  14. The precautionary principle is a striking example of problems with operationalizing hypothetical knowledge for legal action: “The [European] Court of Justice and Court of First Instance, as well as the EFTA Courts, reply to this [speculative health risks] is that a preventative measure cannot properly be based on a purely hypothetical consideration of the risk, founded on mere conjecture that has not been scientifically verified. It follows that there must exist a threshold of scientific plausibility” [67].

  15. Examples of other social systems are science, economy, and law.

  16. A topic which cannot be discussed in depth in this paper.

  17. That is true in both a qualitative and a quantitative regard: to regulate earlier and more often, because of the precautionary principle, and to act on the basis of fragile knowledge: “where there is uncertainty as to the existence or extent of risks to human health, protective measures may be taken without having to wait until the reality and seriousness of those risks become fully apparent” [67, p. 142].

  18. For a thorough discussion, see Jahnel [11] in this special section.

  19. Our translation of “Unsicherheitsabsorption ist ein Entscheidungsprozess.”
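
Illustration for note 7: a minimal Python sketch of the probabilistic approach to risk, understood as probability-weighted harm aggregated over scenarios. All scenario names and numbers are hypothetical and serve only to make the heuristic concrete; they are not drawn from any cited study.

```python
# Minimal sketch of a probabilistic risk measure:
#   risk = sum over scenarios of (probability of occurrence x magnitude of harm)
# All scenario names and numbers are hypothetical illustrations.

scenarios = [
    {"name": "inhalation exposure",   "probability": 1e-4, "harm": 50.0},
    {"name": "dermal exposure",       "probability": 5e-3, "harm": 2.0},
    {"name": "environmental release", "probability": 1e-2, "harm": 10.0},
]

# Expected harm aggregates probability-weighted consequences into a single
# number, which makes otherwise disparate future events comparable.
expected_harm = sum(s["probability"] * s["harm"] for s in scenarios)

for s in scenarios:
    print(f'{s["name"]}: contribution = {s["probability"] * s["harm"]:.4f}')
print(f"aggregate expected harm: {expected_harm:.4f}")
```

Reducing possible futures to a single comparable figure in this way is precisely the heuristic function of probabilistic risk assessment discussed in the paper.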
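
Illustration for note 11: a sketch of what a statement “communicated in statistical terms” can look like, namely the probability of observing a given number of events if only the atmospheric background were at work. The counts are invented for illustration and are not taken from the cited IceCube publication.

```python
# Hypothetical illustration of a statistical statement about an excess of
# events over an expected atmospheric background (numbers are invented,
# not taken from the IceCube publication).
from scipy.stats import poisson

expected_background = 10.6  # expected number of atmospheric events (hypothetical)
observed_events = 28        # observed number of events (hypothetical)

# Probability of seeing at least this many events from the background alone.
p_value = poisson.sf(observed_events - 1, expected_background)
print(f"P(background alone >= {observed_events} events) = {p_value:.2e}")
```

The smaller this probability, the stronger the statistical case for an extraterrestrial rather than atmospheric origin of the observed events; the finding itself remains a statistical construct produced by the observation.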

References

  1. Barber B (1987) Trust in science. Minerva 25:123–134. doi:10.1007/BF01096860

  2. Luhmann N (2000) The reality of the mass media. Stanford University Press, Stanford

  3. NRC (2009) Science and decisions. Advancing risk assessment. National Academies Press, Washington, D.C.

  4. Jahnel J (2015) Addressing the challenges to the risk assessment of nanomaterials. In: Dolez PI (ed) Nanoengineering: Global approaches to health and safety issues. Elsevier, Amsterdam, Boston and Heidelberg, pp 485–521

  5. Miller G, Wickson F (2015) Risk analysis of nanomaterials: exposing nanotechnology’s naked emperor. Rev Policy Res 32:485–512. doi:10.1111/ropr.12129

  6. Wynne B (2001) Creating public alienation: expert cultures of risk and ethics on GMOs. Sci Cult 10:445–481

  7. Jasanoff S (2003) Technologies of humility: citizen participation in governing science. Minerva 41:223–244

  8. SCHER, SCENIHR, SCCS (2013) Making risk assessment more relevant for risk management. Scientific Committee on Consumer Safety; Scientific Committee on Health and Environmental Risks; Scientific Committee on Emerging and Newly Identified Health Risks. European Commission, Brussels

  9. Stirling A (2008) “Opening up” and “closing down” power, participation, and pluralism in the social appraisal of technology. Sci Technol Hum Values 33:262–294. doi:10.1177/0162243907311265

  10. IRGC (2006) White paper on nanotechnology risk governance. International Risk Governance Council, Geneva

  11. Jahnel J (2015) Conceptual questions and challenges associated with the traditional risk assessment paradigm for nanomaterials. NanoEthics 9(3). doi:10.1007/s11569-015-0235-0

  12. Luhmann N (2012) Theory of society - volume 1. Stanford University Press, Stanford

  13. Luhmann N (1994) Die Wissenschaft der Gesellschaft, 2nd edn. Suhrkamp, Frankfurt am Main

  14. Fleischer T, Jahnel J, Seitz S (2012) NanoSafety. Risk governance of manufactured nanoparticles. STOA, Brussels

  15. Groves C (2009) Nanotechnology, contingency and finitude. NanoEthics 3:1–16. doi:10.1007/s11569-009-0057-z

  16. Rocks S, Pollard S, Dorey R et al (2008) Comparison of risk assessment approaches for manufactured nanomaterials. Defra, London

  17. EFSA Scientific Committee (2011) Guidance on the risk assessment of the application of nanoscience and nanotechnologies in the food and feed chain. EFSA J 9:1–36. doi:10.2903/j.efsa.2011.2140

  18. SCENIHR (2007) Opinion on the appropriateness of the risk assessment methodology in accordance with the technical guidance documents for new and existing substances for assessing the risk of nanomaterials. European Commission, Brussels

  19. Robinson LA, Levy DI (2011) The [r]evolving relationship between risk assessment and risk management. Risk Anal 31:1334–1344

  20. OECD (2003) Descriptions of selected key generic terms used in chemical hazard/risk assessment. Organisation for Economic Co-operation and Development (OECD), Paris

  21. Starr C (1969) Social benefit versus technological risk. What is our society willing to pay for safety? Science 165:1232–1238

  22. Kahneman D, Tversky A (1982) Subjective probability - a judgement of representativeness. In: Kahneman D (ed) Judgment under uncertainty: Heuristics and biases. Cambridge University Press, Cambridge, pp 32–47

  23. Jungermann H, Slovic P (1993) Die Psychologie der Kognition und Evaluation von Risiko. In: Bechmann G (ed) Risiko und Gesellschaft. Grundlagen und Ergebnisse interdisziplinärer Risikoforschung. Westdt Verl, Opladen, pp 167–207

  24. Gigerenzer G, Gaissmaier W (2011) Heuristic decision making. Annu Rev Psychol 62:451–482. doi:10.1146/annurev-psych-120709-145346

  25. Rowe WD (1977) An anatomy of risk. Wiley, New York

  26. Renn O (2008) Concepts of risk: an interdisciplinary review - part 2: integrative approaches. GAIA 17:196–204

  27. Jasanoff S (1998) The political science of risk perception. Reliab Eng Syst Saf 59:91–99. doi:10.1016/S0951-8320(97)00129-4

  28. Zwick MM, Renn O (2008) Risikokonzepte jenseits von Eintrittswahrscheinlichkeit und Schadenserwartung. In: Felgentreff C, Glade T (eds) Naturrisiken und Sozialkatastrophen. Spektrum, Berlin, pp 77–97

  29. Merz B, Emmermann R (2006) Zum Umgang mit Naturgefahren in Deutschland: Vom Reagieren zum Risikomanagement. GAIA 15:265–274

  30. Felgentreff C, Glade T (2008) Naturrisiken und Sozialkatastrophen. Spektrum, Berlin et al.

  31. Wynne B (1995) Technology assessment and reflexive learning: Observations from the risk field. In: Rip A, Misa TJ, Schot J (eds) Managing technology in society: The approach of constructive technology assessment. Pinter Publishers, London, pp 19–36

  32. Wynne B (1996) May the sheep safely graze? - A reflexive view of the expert-lay knowledge divide. In: Lash S, Szerszynski B, Wynne B (eds) Risk, environment and modernity: Towards a new ecology. SAGE, London, Thousand Oaks and New Delhi, pp 44–83

  33. Wynne B (2005) Risk as globalizing “democratic” discourse? Framing subjects and citizens. In: Leach M, Scoones I, Wynne B (eds) Science and citizens: Globalization and the challenge of engagement. Zed Books, London, pp 66–82

  34. Luhmann N (1995) Social systems. Stanford University Press, Stanford

  35. Pielke RA (2007) The honest broker. Making sense of science in policy and politics. Cambridge University Press, Cambridge

  36. Japp KP (1992) Selbstverstärkungseffekte riskanter Entscheidungen. Zur Unterscheidung von Rationalität und Risiko. ZfS 21:31–48

  37. Luhmann N (2005) Risk - a sociological theory. Aldine Transactions, New Brunswick (USA)

  38. Büscher C, Mascareño A (2014) Mechanisms of risk production in modern cities. Nat Cult 9:66–86. doi:10.3167/nc.2014.090104

  39. Knight FH (1921) Risk, uncertainty and profit. Houghton Mifflin, http://www.econlib.org/library/Knight/knRUP.html

  40. March JG (1994) A primer on decision making: how decisions happen. Maxwell Macmillan International, New York

  41. Luhmann N (1993) Die Paradoxie der Form. In: Baecker D (ed) Kalkül der Form. Suhrkamp, Frankfurt am Main, pp 197–212

  42. Hansen SF, Baun A (2015) DPSIR- and stakeholder analysis of the use of nanosilver. NanoEthics 9(2)

  43. MacCrimmon KR, Wehrung DA, Stanbury WT (1986) Taking risks: the management of uncertainty. Free Press, New York

  44. Elster J (1994) Rationality, emotions, and social norms. Synthese 98:21–49. doi:10.1007/BF01064024

  45. Brunsson N (2000) The irrational organization: irrationality as a basis for organizational action and change, 2nd edn. Fagbokforlaget Vigmostad Bjørke, Bergen

  46. Brunsson N (2007) The consequences of decision-making. Oxford University Press, Oxford

  47. March JG, Simon H (1993) Organizations, 2nd edn. Blackwell Publishers, Cambridge

  48. Spencer-Brown G (1997) Laws of Form. Gesetze der Form. Bohmeier, Lübeck

  49. Merton RK (1973) The normative structure of science. In: Storer NW (ed) The sociology of science: Theoretical and empirical investigations. University of Chicago Press, Chicago, pp 267–278

  50. Luhmann N (2006) System as difference. Organization 13:37–57. doi:10.1177/1350508406059638

  51. Malle BF (1999) How people explain behavior: a New theoretical framework. Personal Soc Psychol Rev 3:23–48. doi:10.1207/s15327957pspr0301_2

  52. Parsons T, Platt GM (1973) The American university. Harvard University Press, Cambridge

  53. Pinch T (1985) Towards an analysis of scientific observation: the externality and evidential significance of observational reports in physics. Soc Stud Sci 15:3–36. doi:10.1177/030631285015001001

  54. Douglas H (2004) The irreducible complexity of objectivity. Synthese 138:453–473. doi:10.1023/B:SYNT.0000016451.18182.91

  55. IceCube Collaboration (2013) Evidence for high-energy extraterrestrial neutrinos at the IceCube detector. Science 342:1–7. doi:10.1126/science.1242856

  56. Spencer-Brown G (1996) Wahrscheinlichkeit und Wissenschaft. Carl-Auer-Systeme, Heidelberg

  57. Weiss C (2003) Expressing scientific uncertainty. Law Probab Risk 2:25–46. doi:10.1093/lpr/2.1.25

  58. von Weizsäcker CF (1985) Aufbau der Physik. Carl Hanser Verlag, München

  59. Lindley DV (2000) The philosophy of statistics. J R Stat Soc Ser Stat 49:293

  60. Kusche I (2008) Politikberatung und die Herstellung von Entscheidungssicherheit im politischen System. VS Verl. für Sozialwiss, Wiesbaden

  61. Klaine SJ, Koelmans AA, Horne N et al (2012) Paradigms to assess the environmental impact of manufactured nanomaterials. Environ Toxicol Chem 31:3–14. doi:10.1002/etc.733

  62. JRC (2011) REACH implementation project: substance identification of nanomaterials (RIP-oN 1) - Advisory report. European Commission Joint Research Centre, Institute for Health and Consumer Protection

  63. Renn O, Grobe A (2010) Risk governance in the field of nanotechnologies: Core challenges of an integrative approach. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 484–507

  64. Luhmann N (1996) On the scientific context of the concept of communication. Soc Sci Inf 35:257–267. doi:10.1177/053901896035002005

  65. Meili C, Widmer M (2010) Voluntary measures in nanotechnology risk governance: The difficulty of holding the wolf by the ears. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 446–461

  66. Krug HF (2014) Nanosafety research—Are we on the right track? Angew Chem Int Ed. doi:10.1002/anie.201403367

  67. de Sadeleer N (2006) The precautionary principle in EC health and environmental Law. Eur Law J 12:139–172. doi:10.1111/j.1468-0386.2006.00313.x

  68. Widmer M, Meili C (2010) Approaching the nanoregulation problem in chemical legislation in the EU and US. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 239–267

  69. Japp KP (2000) Distinguishing non-knowledge. Can J Sociol 25:225–238

  70. Hodge GA, Bowman DM, Maynard AD (2010) Introduction: The regulatory challenges for nanotechnologies. In: Hodge GA, Bowman DM, Maynard AD (eds) International handbook on regulating nanotechnologies. Edward Elgar Publishing, Cheltenham, UK/ University of Michigan, USA, pp 3–24

  71. Hansen SF (2013) The European Union’s chemical legislation needs revision. Nat Nanotechnol 8:305–306. doi:10.1038/nnano.2013.72

  72. Weick KE, Sutcliffe KM, Obstfeld D (2005) Organizing and the process of sensemaking. Organ Sci 16:409–421. doi:10.1287/orsc.1050.0133

  73. Luhmann N (2010) Politische Soziologie. Suhrkamp, Berlin

  74. Millstone E (2010) The evolution of risk assessment paradigms: in theory and in practice. Sussex, England

  75. Codex Alimentarius Commission (2007) Codex alimentarius. World Health Organization, Food and Agriculture Organization of the United Nations, Rome

  76. Luhmann N (2000) Organisation und Entscheidung. Westdt Verl, Opladen

Acknowledgments

I would like to express my gratitude to the editor and the reviewer for their helpful comments, which improved this paper. I would also like to thank my colleagues at the Institute for Technology Assessment and Systems Analysis for their contributions in countless discussions. In particular, Jutta Jahnel helped me sharpen the arguments. For her patience and support with linguistic and stylistic issues, I would like to thank Mira Klemm. Any remaining issues are my responsibility.

Author information

Correspondence to Christian Büscher.

Ethics declarations

Conflict of Interest

The author declares that he has no conflict of interest.

About this article

Cite this article

Büscher, C. Risk Calculation as Experience and Action—Assessing and Managing the Risks and Opportunities of Nanomaterials. Nanoethics 9, 277–295 (2015). https://doi.org/10.1007/s11569-015-0237-y
