How to conceive of science for the benefit of society: prospects of responsible research and innovation

  • Special Issue: Responsible Research and Innovation
  • Published in Synthese

Abstract

Responsible research and innovation (RRI) features the dialog of science “with society,” and research performed “for society,” i.e., for the benefit of the people. I focus on this latter, outcome-oriented notion of RRI and discuss two kinds of problems. The first concerns options for anticipating the future course of science and technology. Such foresight knowledge seems necessary for subjecting research to demands of social and moral responsibility. However, predicting science and technology is widely considered impossible. The second problem concerns moral evaluation. The benefit or harm produced by certain research and innovation achievements is often hard to estimate. Against this background of uncertainty in factual and moral assessment, I explore the opportunities left for RRI. First, RRI should contribute to maintaining a wide range of approaches. Second, judgments about RRI should draw on zones of convergence among the variety of research approaches pursued. Third, decisions about implementing a technology should be revisable. Fourth, the more specific inclusion of demands from society should be reserved for technology development. Fifth, in many respects, the social compatibility of a new technology owes more to the social context than to the inherent features of the product. Favorable circumstances are transparency, representation of all relevant parties, and procedural fairness. As a result, some of the benefits and detriments of research and innovation can be identified without detailed knowledge of future findings.


Notes

  1. See Lengwiler (2008) for an overview of various such schemes, their historical changes, and the pertinent literature. See Stilgoe et al. (2013) and Fautz et al. (2016) for discussions of participatory formats. Carrier and Gartzlaff (forthcoming) provide a review of the more recent literature on participatory RRI and present the results of a survey on attitudes toward RRI in the scientific community.

  2. It might be worth emphasizing that the validity of any such prediction about future paths of technology depends on human action. Such predictions can always turn out to be self-destroying in the sense that people refuse to accept the relevant technology. The choices left to society and politics for shaping the future certainly cannot be curtailed by projections of technology development (Nordmann 2007, pp. 31–34; Nordmann 2014, pp. 90–91; Stilgoe et al. 2013, p. 1571). In a similar vein, scientific and technological projections may be thwarted by social features such as carelessness or abuse (Weckert and Moor 2007, pp. 136–138).

  3. David Resnik (2017, pp. 66–73) advocates the use of the inductive-risk scheme for dealing with the legitimacy of publishing the full H5N1 virus strain (see above). However, his extensive discussion of the various factors relevant for such a decision fails to arrive at a recommendation. This inconclusive ending confirms the limited usefulness of the inductive-risk scheme for coming to terms with complex moral challenges arising from scientific research.

  4. It is worth noting that the adequate use of the precautionary principle demands some foresight knowledge in both versions. The minimum knowledge required concerns possible side-effects. In the trade-off interpretation the demands are even higher. Thus, the invocation of the precautionary principle remains within the consequentialist approach to RRI (see Sect. 2).

  5. Some authors complain that real processes of this sort are characterized by information asymmetry, power imbalance, and lack of transparency (Blok and Lemmens 2015, pp. 23–24). Yet this only makes it more urgent to work out procedural schemes that promote these value commitments.

References

  • Biddle, J. (2007). Lessons from the Vioxx debacle: What the privatization of science can teach us about social epistemology. Social Epistemology, 21, 21–39.

  • Blok, V., & Lemmens, P. (2015). The emerging concept of responsible innovation. Three reasons why it is questionable and calls for a radical transformation of the concept of innovation. In B.-J. Koops (Ed.), Responsible innovation 2: Concepts, approaches, and applications (pp. 19–35). Berlin: Springer.

  • Carrier, M. (2010). Theories for use: On the bearing of basic science on practical problems. In M. Suárez, et al. (Eds.), EPSA epistemology and methodology of science: Launch of the European philosophy of science association (pp. 23–34). Dordrecht: Springer.

  • Carrier, M. (2011). ‘Knowledge is power’, or how to capture the relations between science and technoscience. In A. Nordmann, H. Radder, & G. Schiemann (Eds.), Science transformed? Debating claims of an epochal break (pp. 43–53). Pittsburgh: University of Pittsburgh Press.

  • Carrier, M. (2017). Facing the credibility crisis of science: On the ambivalent role of pluralism in establishing relevance and reliability. Perspectives on Science, 25, 439–464.

  • Carrier, M., & Gartzlaff, M. (forthcoming). Responsible research and innovation: Hopes and fears in the scientific community in Europe. Under review.

  • Collingridge, D. (1980). The social control of technology. London: Frances Pinter.

  • de Vries, M. J. (2011). Science in the context of industrial application: The case of the Philips Natuurkundig Laboratorium. In M. Carrier & A. Nordmann (Eds.), Science in the context of application. Methodological change, conceptual transformation, cultural reorientation (pp. 47–66). Dordrecht: Springer.

  • Douglas, H. (2000). Inductive risk and values. Philosophy of Science, 67, 559–579.

  • Douglas, H. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.

  • Elliott, K. C. (2013). Douglas on values: From indirect roles to multiple roles. Studies in History and Philosophy of Science, 44, 375–383.

  • Enserink, M. (2013). Flu researcher Ron Fouchier loses legal fight over H5N1 studies. Science (September 25). http://www.sciencemag.org/news/2013/09/flu-researcher-ron-fouchier-loses-legal-fight-over-h5n1-studies. Accessed 7 March 2017.

  • Epstein, S. (1995). The construction of lay expertise: AIDS activism and the forging of credibility in the reform of clinical trials. Science, Technology and Human Values, 20, 408–437.

  • Fautz, C., Böschen, S., Hahn, J., Hennen, L., & Jahnel, J. (2016). Erfolgsbedingungen der Öffentlichkeitseinbindung in unterschiedlichen Innovationssettings. Technikfolgenabschätzung—Theorie und Praxis, 25, 73–77.

  • Freud, S. (1920). General introduction to psychoanalysis. New York: Boni and Liveright.

  • Grinbaum, A., & Groves, C. (2013). What is ‘responsible’ about responsible innovation? Understanding the ethical issues. In R. Owen, J. Bessant, & M. Heintz (Eds.), Responsible innovation (pp. 119–142). Chichester: Wiley.

  • Grunwald, A. (2014). The hermeneutic side of responsible research and innovation. Journal of Responsible Innovation, 1, 274–291.

  • Hansson, S. O. (2011). Coping with the unpredictable effects of future technologies. Philosophy & Technology, 24, 137–149.

  • Hughes, T. P. (1987). The evolution of large technological systems. In W. E. Bijker, T. P. Hughes, & T. Pinch (Eds.), The social construction of technological systems: New directions in the sociology and history of technology (pp. 51–81). Cambridge: MIT Press.

  • Kitcher, P. (2001). Science, truth, democracy. Oxford: Oxford University Press.

  • Kitcher, P. (2011). Science in a democratic society. Amherst: Prometheus.

  • Koertge, N. (2000). Science, values, and the value of science. Philosophy of Science, 67, S45–S57.

  • Kourany, J. (2019). Should some knowledge be forbidden? The case of cognitive differences research. In J. Kourany & M. Carrier (Eds.), When the quest for knowledge is thwarted. Science and the production of ignorance. Cambridge, MA: MIT Press.

  • Lacey, H. (2013). Rehabilitating neutrality. Philosophical Studies, 163, 77–83.

  • Lacey, H. (2016). Science, respect for nature, and human well-being: Democratic values and the responsibilities of scientists today. Foundations of Science, 21, 51–67.

  • Lengwiler, M. (2008). Participatory approaches in science and technology. Historical origins and current practices in critical perspective. Science, Technology and Human Values, 33, 186–200.

  • Macnaghten, P. (2016). Responsible innovation and the reshaping of existing technological trajectories: The hard case of genetically modified crops. Journal of Responsible Innovation, 3, 282–289.

  • Magnus, D. (2008). Risk management versus the precautionary principle. Agnotology as a strategy in the debate over genetically engineered organisms. In R. N. Proctor & L. Schiebinger (Eds.), Agnotology. The making and unmaking of ignorance (pp. 250–265). Palo Alto: Stanford University Press.

  • McCright, A. M., et al. (2013). The influence of political ideology on trust in science. Environmental Research Letters, 8, 1–9.

  • Merton, R. K. (1973). The normative structure of science. In R. K. Merton (Ed.), The sociology of science. Theoretical and empirical investigations (pp. 267–278). Chicago: University of Chicago Press.

  • Michaels, D. (2008). Manufactured uncertainty. Contested science and the protection of the public’s health and environment. In R. N. Proctor & L. Schiebinger (Eds.), Agnotology. The making and unmaking of ignorance (pp. 90–107). Palo Alto: Stanford University Press.

  • Monyer, H., et al. (2004). Das Manifest. Elf führende Neurowissenschaftler über Gegenwart und Zukunft der Hirnforschung. Gehirn und Geist, 06/2004, 30–37.

  • Nordmann, A. (2007). If and then: A critique of speculative nanoethics. Nanoethics, 1, 31–46.

  • Nordmann, A. (2014). Responsible innovation, the art and craft of anticipation. Journal of Responsible Innovation, 1, 87–98.

  • Pew Research Center (2015). Cell phones in Africa: Communication lifeline. http://www.pewglobal.org/files/2015/04/Pew-Research-Center-Africa-Cell-Phone-Report-FINAL-April-15-2015.pdf. Accessed 14 Feb 2017.

  • Proelss, A. (2010). International environmental law and the challenge of climate change. German Yearbook of International Law, 53, 65–87.

  • Resnik, D. B. (1998). The ethics of science. An introduction. London: Routledge.

  • Resnik, D. B. (2017). Dual-use research and inductive risk. In K. C. Elliott & T. Richards (Eds.), Exploring inductive risk. Case studies of values in science (pp. 59–77). Oxford: Oxford University Press.

  • Resnik, D. B., & Elliott, K. C. (2016). The ethical challenges of socially responsible science. Accountability in Research Policies and Quality Assurance, 23, 31–46.

  • Sandin, P. (1999). Dimensions of the precautionary principle. Human and Ecological Risk Assessment, 5, 889–907.

  • Stilgoe, J., Owen, R., & Macnaghten, P. (2013). Developing a framework for responsible innovation. Research Policy, 42, 1568–1580.

  • Stokes, D. E. (1997). Pasteur’s quadrant. Basic science and technological innovation. Washington, DC: Brookings Institution Press.

  • UNESCO (2005). The precautionary principle. http://unesdoc.unesco.org/images/0013/001395/139578e.pdf. Accessed 30 Aug 2018.

  • von Schomberg, R. (2013). A vision of responsible innovation. In R. Owen, M. Heintz, & J. Bessant (Eds.), Responsible innovation: Managing the responsible emergence of science and innovation in society (pp. 51–74). London: Wiley.

  • Weckert, J., & Moor, J. (2007). The precautionary principle in nanotechnology. In F. Allhoff, P. Lin, J. Moor, & J. Weckert (Eds.), Nanoethics: The ethical and social implications of nanotechnology (pp. 133–146). Hoboken: Wiley.

  • Wick, R. R., et al. (2008). Increased risk of myeloid leukaemia in patients with ankylosing spondylitis following treatment with radium-224. Rheumatology, 47, 855–859.

  • Wilholt, T. (2012). Die Freiheit der Forschung. Begründungen und Begrenzungen. Berlin: Suhrkamp.

  • Willyard, C. (2018). Send in the germs. Nature, 556, 16–18.

  • Young, E. (2007). Non-invasive glucose monitoring for diabetes: Five strategies under development. The Pharmaceutical Journal. https://www.pharmaceutical-journal.com/news-and-analysis/features/non-invasive-glucose-monitoring-for-diabetes-five-strategies-under-development/20203666.article?firstPass=false. Accessed 12 Oct 2017.

Acknowledgements

Preliminary versions of this paper were presented on half a dozen occasions. I am grateful to the various audiences involved for their helpful questions and criticism. I am indebted, in particular, to my Bielefeld colleagues Minea Gartzlaff and Marie Kaiser for their valuable advice. The paper grew out of the NUCLEUS Project, which was supported under the Horizon 2020 program (Grant Agreement No. 664932). NUCLEUS was led by Alexander Gerber (Rhine-Waal University of Applied Sciences).

Author information

Corresponding author

Correspondence to Martin Carrier.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Carrier, M. How to conceive of science for the benefit of society: prospects of responsible research and innovation. Synthese 198 (Suppl 19), 4749–4768 (2021). https://doi.org/10.1007/s11229-019-02254-1
