Abstract
In this contribution, we set out a framework for ethical research and innovation. Our framework draws upon recent scholarly work recommending the introduction of new models at the intersection of ethics, strategy, and science and technology studies to inform and explicate how the decisions of researchers can be considered ethical. Ethical research and innovation is construed in our framework as a dynamic process emerging from decisions of multiple stakeholders in innovation ecosystems prior to, during and after the execution of a research and innovation project. The framework can be used by different types of research organizations to implement governance models of ethical research and innovation.
Notes
Moral overload results from the excessive overheads associated with rules and norms for the responsible conduct of research and innovation.
The institutional void is defined as the lack of institutions guiding the governance of research and innovation.
These are methods aiming to implement applied ethics prior to, during, and after the execution of a research and innovation project.
Human and common goods are defined as knowledge and innovation spillovers introduced to society such as the generation of new knowledge that becomes publicly available and the generation of innovations that become part of the public domain. We also include under this term intellectual properties (IPs) that may not be in the public domain but can help introduce innovation spillovers that contribute to the economy, the social advancement of humanity and the preservation of the environment.
The reader is referred to the essay of Isaiah Berlin on these two forms of freedom (Berlin 1969) and to https://plato.stanford.edu for further discussions on this subject.
Which corresponds to the overheads associated with reflecting upon and anticipating the impact of research and innovation and responding to any of their potential threats.
These are projects that pose serious ethical issues in terms of the ANERIA they entail.
Though in some cases they may engage political communication for corporate reputational effects above and beyond what is required by existing regulatory frameworks.
Such as ethical review boards and codes of responsible conduct of research.
This is reflected in the reduced number of stakeholders shown in Fig. 2.
Such as the project mentioned in “Appendix A”.
In “Appendix A”, we show how these weights can be generated.
This scale of relative importance is defined as follows: 1 (equal), 2 (moderately equal), 3 (weakly stronger), 4 (moderately stronger), 5 (stronger), 6 (stronger to much stronger), 7 (much stronger), 8 (much stronger to extremely stronger), 9 (extremely stronger). The reciprocal values correspond to the multiplicative inverse of these values.
The values in the evaluation matrix of alternatives corresponded to the value delivered by each alternative for each criterion using the following Likert scale: very unsatisfactory (1), unsatisfactory (2), neutral (3), satisfactory (4), and very satisfactory (5).
According to the plurality rule, the alternative most often ranked in the first place is the chosen alternative for the group of strategists.
References
Adam, B., & Groves, C. (2011). Futures tended: Care and future-oriented responsibility. Bulletin of Science, Technology and Society, 31, 17–27.
Adner, R. (2006). Match your innovation strategy to your innovation ecosystem. Harvard Business Review, 84, 98–107.
Adner, R., & Kapoor, R. (2016). Innovation ecosystems and the pace of substitution: Reexamining technology S-curves. Strategic Management Journal, 37, 625–648.
Ahrweiler, P., Gilbert, N., Schrempf, B., Grimpe, B., & Jirotka, M. (2019). The role of civil society organisations in European responsible research and innovation. Journal of Responsible Innovation, 6, 25–49.
Ahrweiler, P., Pyka, A., & Gilbert, N. (2011). Agency and structure: A social simulation of knowledge-intensive industries. Computational and Mathematical Organization Theory, 17, 59–76.
Alexy, O., Criscuolo, P., & Salter, A. (2009). Does IP strategy have to cripple open innovation? MIT Sloan Management Review, 51, 71–77.
Allhoff, F. (2014). The coming era of nanomedicine. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 155–166). London: Palgrave Macmillan.
Anastasiadis, S., Moon, J., & Humphreys, M. (2018). Lobbying and the responsible firm: Agenda-setting for a freshly conceptualized field. Business Ethics: A European Review, 27, 207–221.
Armstrong, M., Cornut, G., Delacôte, S., Lenglet, M., Millo, Y., Muniesa, F., et al. (2012). Towards a practical approach to responsible innovation in finance: New product committees revisited. Journal of Financial Regulation and Compliance, 20, 147–168.
Barben, D., Fisher, E., Selin, C., & Guston, D. (2008). Anticipatory governance of nanotechnology: Foresight, engagement, and integration. In E. Hackett, M. Lynch, & J. Wajcman (Eds.), The Handbook of Science and Technology Studies (3rd ed., pp. 979–1000). Cambridge: MIT Press.
Baron, R., & Kenny, D. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173–1182.
Basl, J. (2014). What to do about artificial consciousness. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 380–392). London: Palgrave Macmillan.
Bedau, M., & Triant, M. (2014). Social and ethical implications of creating artificial cells. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 562–574). London: Palgrave Macmillan.
Benington, J., & Moore, M. (2010). From public choice to public value. In J. Benington & M. Moore (Eds.), Public value: Theory and practice (pp. 31–51). London: Palgrave Macmillan.
Bergek, A., Jacobsson, S., Carlsson, B., Lindmark, S., & Rickne, A. (2008). Analyzing the functional dynamics of technological innovation systems: A scheme of analysis. Research Policy, 37, 407–429.
Berlin, I. (1969). Two concepts of liberty. In I. Berlin (Ed.), Four essays on liberty. London: Oxford University Press.
Bessen, J. (2004). Holdup and licensing of cumulative innovations with private information. Economics Letters, 82, 321–326.
Bombard, Y., Abelson, J., Simeonov, D., & Gauvin, F. (2011). Eliciting ethical and social values in health technology assessment: A participatory approach. Social Science and Medicine, 73, 135–144.
Borda, J. (1784). Mémoire sur les élections au scrutin. Paris: Académie Royale des Sciences.
Borning, A., & Muller, M. (2012). Next steps for value sensitive design. In Proceedings of the 2012 ACM annual conference on human factors in computing systems (CHI 2012) (pp. 1125–1134). New York: Association for Computing Machinery.
Bostrom, N. (2014). Why I want to be a posthuman when I grow up. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 218–234). London: Palgrave Macmillan.
Brans, J., & Mareschal, B. (1990). The PROMÉTHÉE methods for MCDM, the PROMCALC, GAIA and BANDADVISER software. In C. Bana e Costa (Ed.), Readings in multiple criteria decision aid (pp. 216–252). Berlin: Springer.
Brey, P. (2014). Virtual reality and computer simulation. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 315–332). London: Palgrave Macmillan.
Cafaro, P. (2014). Avoiding catastrophic climate change: Why technological innovation is necessary but not sufficient. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 424–438). London: Palgrave Macmillan.
Callon, M., Lascoumes, P., & Barthe, Y. (2009). Acting in an uncertain world: An essay on technical democracy. Cambridge: MIT Press.
Carlson, C., & Wilmot, W. (2006). The five disciplines for creating what customers want. New York: Crown Publishing Group.
Carlsson, B., & Stankiewicz, R. (1991). On the nature, function and composition of technological systems. Journal of Evolutionary Economics, 1, 93–118.
Carroll, A. (1979). A three-dimensional conceptual model of corporate performance. Academy of Management Review, 4, 497–505.
Comstock, G. (2014). Ethics and genetically modified foods. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 473–485). London: Palgrave Macmillan.
Condorcet, M. (1785). Essai sur l'application de l'analyse à la probabilité des décisions rendues à la pluralité des voix. Paris: Académie Royale des Sciences.
Cotton, M. (2009). Evaluating the ethical matrix as a radioactive waste management deliberative decision-support tool. Environmental Values, 18, 153–176.
Crow, M., & Dabars, W. (2015). A new model for the American research university. Issues in Science and Technology, 31. Accessible at https://issues.org/a-new-model-for-the-american-research-university.
Douglas, T. (2014). Moral enhancement. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 235–251). London: Palgrave Macmillan.
Felt, U., Fochler, M., Müller, A., & Strassnig, M. (2008). Unruly ethics: On the difficulties of a bottom-up approach to ethics in the field of genomics. Public Understanding of Science, 18, 354–371.
Figueira, J., Greco, S., & Ehrgott, M. (2005). Multiple criteria decision analysis: State of the art surveys. Berlin: Springer.
Fishburn, P. C. (1973). The theory of social choice. Princeton: Princeton University Press.
Flipse, S. M., van der Sanden, M., & Osseweijer, P. (2013). The why and how of enabling the integration of social and ethical aspects in research and development. Science and Engineering Ethics, 19, 703–725.
Freeman, R., Martin, K., & Parmar, B. (2007). Stakeholder capitalism. Journal of Business Ethics, 74, 303–314.
Fritzsche, D. (1991). A model of decision-making incorporating ethical values. Journal of Business Ethics, 10, 841–852.
Garfinkle, M., & Knowles, L. (2014). Synthetic biology, biosecurity, and biosafety. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 533–547). London: Palgrave Macmillan.
Genus, A. (2006). Rethinking constructive technology assessment as democratic, reflective, discourse. Technological Forecasting and Social Change, 73, 13–26.
Gianni, R., & Goujon, P. (2014). Analytical GRID report: Deliverable 2.3. GREAT Project. Accessible at https://www.great-project.eu/.
Godin, B., & Lane, J. (2013). Pushes and pulls: The Hi(S)tory of the demand pull model of innovation. Science, Technology, and Human Values, 38, 621–654.
Goldin, C., & Katz, L. (2008). The race between education and technology. Cambridge: Harvard University Press.
Gomes, L., & Lima, M. (1991). TODIM: Basics and application to multicriteria ranking of projects with environmental impacts. Foundations of Computing and Decision Sciences, 16, 113–127.
Granstrand, O., & Holgersson, M. (2013). Managing the intellectual property disassembly problem. California Management Review, 55, 148–210.
Granstrand, O., & Sjölander, M. (1990). Managing innovation in multi-technology corporations. Research Policy, 19, 35–60.
Grinbaum, A., & Groves, C. (2013). What is “responsible” about responsible innovation? Understanding the ethical issues. In R. Owen, J. Bessant, & M. Heintz (Eds.), Responsible innovation: Managing the responsible emergence of science and innovation in society (pp. 119–142). London: Wiley.
Groves, C. (2006). Technological futures and non-reciprocal responsibility. The International Journal of the Humanities, 4, 57–61.
Guston, D. (2006). Responsible innovation in the commercialized university. In D. G. Stein (Ed.), Buying in or selling out: The commercialization of the American research university (pp. 161–174). New Brunswick: Rutgers University Press.
Guston, D., & Sarewitz, D. (2002). Real-time technology assessment. Technology in Society, 24, 93–109.
Hajer, M. (2003). Policy without polity? Policy analysis and the institutional void. Policy Sciences, 36, 175–195.
Hamilton, C. (2014). Ethical anxieties about geoengineering. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 439–455). London: Palgrave Macmillan.
Hanusch, H., & Pyka, A. (2006). Comprehensive neo-Schumpeterian economics and the Lisbon Agenda: Detecting patterns of varying future-orientation in Europe. Galileu Revista de Economia e Direito, 9, 17–40.
Hanusch, H., & Pyka, A. (2007). Joseph Alois Schumpeter (1883–1950). In H. Hanusch & A. Pyka (Eds.), The Elgar companion to neo-Schumpeterian economics (Vol. 31, pp. 19–26). Cheltenham: Edward Elgar.
Heath, J., Moriarty, J., & Norman, W. (2010). Business ethics and (or as) political philosophy. Business Ethics Quarterly, 20, 427–452.
Heidegger, M. (1977). The question concerning technology. In D.-F. Krell (Ed.), Martin Heidegger: Basic writings (pp. 287–317). New York: Harper & Row.
Hellstrom, T. (2003). Systemic innovation and risk: Technology assessment and the challenge of responsible innovation. Technology in Society, 25, 369–384.
Himma, K., & Bottis, M. (2014). The digital divide: Information technologies and the obligation to alleviate poverty. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 333–346). London: Palgrave Macmillan.
Holgersson, M., Granstrand, O., & Bogers, M. (2018). The evolution of intellectual property strategy in innovation ecosystems: Uncovering complementary and substitute appropriability regimes. Long Range Planning, 51, 303–319.
Hosseini, J., & Brenner, S. (1992). The stakeholder theory of the firm: A methodology to generate value matrix weights. Business Ethics Quarterly, 2, 99–119.
Hwang, C., & Yoon, K. (1981). Multiple attribute decision making: Methods and applications. New York: Springer.
Jonas, H. (2014). Technology and responsibility: Reflections on the new tasks of ethics. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 37–47). London: Palgrave Macmillan.
Keeney, R., & Raiffa, H. (1976). Decisions with multiple objectives: Preferences and value trade-offs. New York: Wiley.
Kilgour, D.-M., & Eden, C. (2010). Handbook of group decision and negotiation, advances in group decision and negotiation. Dordrecht: Springer.
Lave, R., Mirowski, P., & Randalls, S. (2010). Introduction: STS and neoliberal science. Social Studies of Science, 40, 659–675.
Lee, R. (2012). Look at mother nature on the run in the 21st century: Responsibility, research and innovation. Transnational Environmental Law, 1, 105–117.
Lenoble, J., & Maesschalk, M. (2003). Towards a theory of governance: The action of norms. The Hague: Kluwer Law International.
Lenoble, J., & Maesschalk, M. (2010). Democracy, law and governance. Farnham: Ashgate.
Lezaun, J., & Soneryd, L. (2007). Consulting citizens: Technologies of elicitation and the mobility of publics. Public Understanding of Science, 16, 279–297.
Lin, P., Abney, K., & Bekey, G. (2014). Ethics, war, and robots. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 349–362). London: Palgrave Macmillan.
Maitland, I. (1997). The great non-debate over international sweatshops. In British Academy of Management annual conference proceedings (pp. 240–265). London: British Academy of Management.
Martin, K., & Freeman, R. (2004). The separation of technology and ethics in business ethics. Journal of Business Ethics, 53, 353–364.
McLean, I., & Urken, A. (1995). Classics of social choice. Ann Arbor: The University of Michigan Press.
Miller, K. (2015). Agent-based modeling and organization studies: A critical realist perspective. Organization Studies, 36, 175–196.
Minteer, B., & Collins, J. (2014). Ecosystems unbound: Ethical questions for an interventionist ecology. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 456–469). London: Palgrave Macmillan.
Mitcham, C. (2003). Co-responsibility for research integrity. Science and Engineering Ethics, 9, 273–290.
Mitchell, R., Agle, B., & Wood, D. (1997). Toward a theory of stakeholder identification and salience: Defining the principle of who and what really counts. Academy of Management Review, 22, 853–886.
Moore, A. (2010). Beyond participation: Opening up political theory in STS. Social Studies of Science, 40, 793–799.
Munda, G. (2004). Social multi-criteria evaluation: Methodological foundations and operational consequences. European Journal of Operational Research, 158, 662–677.
Owen, R., Baxter, D., Maynard, T., & Depledge, M. (2009). Beyond regulation: Risk pricing and responsible innovation. Environmental Science and Technology, 43, 5171–5175.
Owen, R., Macnaghten, P., & Stilgoe, J. (2012). Responsible research and innovation: From science in society to science for society, with society. Science and Public Policy, 39, 751–760.
Owen, R., & Pansera, M. (2019). Responsible innovation and responsible research and innovation. In D. Simon, S. Kuhlmann, J. Stamm, & W. Canzler (Eds.), Handbook on science and public policy (pp. 26–48). London: Edward Elgar Publishing.
Paredes-Frigolett, H., Gomes, L., & Pereira, J. (2015). Governance of responsible research and innovation: An agent-based model approach. Procedia Computer Science, 55, 912–921.
Polanyi, M. (1962). The republic of science: Its political and economic theory. Minerva, 1, 54–73.
Pyka, A., Gilbert, N., & Ahrweiler, P. (2007). Simulating knowledge generation and distribution processes in innovation collaborations and networks. Cybernetics and Systems, 38, 667–693.
Radanliev, P., De Roure, D., Page, K., Nurse, J., Mantilla Montalvo, R., Santos, O., et al. (2020). Cyber risk at the edge: Current and future trends on cyber risk analytics and artificial intelligence in the industrial internet of things and industry 4.0 supply chains. Cybersecurity, 3, 1–21.
Ramírez, R., & Wilkinson, A. (2016). Strategic reframing: The Oxford scenario planning approach. Oxford: Oxford University Press.
Reich, R. (2010). Aftershock: The next economy and America's future. New York: Knopf.
Reijers, W., Wright, D., Brey, P., Weber, K., Rodríguez, R., O’Sullivan, D., & Gordijn, B. (2018). Methods for practising ethics in research and innovation: A literature review, critical analysis and recommendations. Science and Engineering Ethics, 24, 1437–1481.
Rip, A., Misa, T., & Schot, J. (1995). Managing technology in society: The approach of constructive technology assessment. London: Thomson.
Robert, J., & Baylis, F. (2014). Crossing species boundaries. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 139–154). London: Palgrave Macmillan.
Saaty, T. (1980). The analytic hierarchy process. New York: McGraw-Hill.
Salo, A., & Hämäläinen, R. (2010). Multicriteria decision analysis in group decision processes. In D. Kilgour & C. Eden (Eds.), Handbook of group decision and negotiation, advances in group decision and negotiation (pp. 269–283). Dordrecht: Springer.
Singer, A. (2013). Corporate political activity, social responsibility, and competitive strategy: An integrative model. Business Ethics: A European Review, 22, 308–324.
Singer, A., & Singer, M. (1997). Management science and business ethics. Journal of Business Ethics, 16, 385–395.
Smith, A., Stirling, A., & Berkhout, F. (2005). The governance of sustainable socio-technical transitions. Research Policy, 34, 1491–1510.
Stanley, J., & Steinhardt, B. (2014). Bigger monster, weaker chains: The growth of an American surveillance society. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 269–284). London: Palgrave Macmillan.
Stilgoe, J., Owen, R., & Macnaghten, P. (2013). Developing a framework for responsible innovation. Research Policy, 42, 1568–1580.
Stirling, A. (2008). “Opening up” and “closing down”: Power, participation, and pluralism in the social appraisal of technology. Science, Technology, and Human Values, 33, 262–294.
Streiffer, R., & Basl, J. (2014). The ethics of agricultural animal biotechnology. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 501–515). London: Palgrave Macmillan.
Teece, D. (2007). Explicating dynamic capabilities: The nature and microfoundations of (sustainable) enterprise performance. Strategic Management Journal, 28, 1319–1350.
Triulzi, G., Pyka, A., & Scholz, R. (2014). R&D and knowledge dynamics in university-industry relationships in biotech and pharmaceuticals: An agent-based model. International Journal of Biotechnology, 13, 137–179.
Tushman, M. (1977). A political approach to organizations: A review and rationale. The Academy of Management Review, 2, 206–216.
van de Poel, I. (2009). Values in engineering design. Philosophy of technology and engineering sciences. In Handbook of the philosophy of science (pp. 973–1006). Amsterdam: Elsevier.
van den Hoven, M., Lokhorst, G., & van de Poel, I. (2012). Engineering and the problem of moral overload. Science and Engineering Ethics, 18, 1–13.
von Schomberg, R. (2011). Towards responsible research and innovation in the information and communication technologies fields. Brussels: European Commission.
von Schomberg, R. (2013). A vision of responsible innovation. In R. Owen, J. Bessant, & M. Heintz (Eds.), Responsible innovation: Managing the responsible emergence of science and innovation in society (pp. 51–74). London: Wiley.
von Schomberg, R. (2014). The “quest” for the right impacts of science and technology: A framework for responsible research and innovation. In J. van den Hoven, N. Doorn, T. Swierstra, B.-J. Koops, & H. Romijn (Eds.), Responsible innovation 1: Innovative solutions for global issues (pp. 33–50). Dordrecht: Springer.
Wallach, W. (2014). Ethics, law, and governance in the development of robots. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 363–379). London: Palgrave Macmillan.
Wartick, S., & Cochran, P. (1985). The evolution of the corporate social performance model. Academy of Management Review, 10, 758–769.
Winner, L. (1986). The whale and the reactor: A search for limits in an age of high technology. Chicago: The University of Chicago Press.
Winner, L. (2014). Technologies as forms of life. In R. Sandler (Ed.), Ethics and emerging technologies (pp. 48–60). London: Palgrave Macmillan.
Wynne, B. (2006). Risk as globalizing discourse? Framing subjects and citizens. In M. Leach, I. Scoones, & B. Wynne (Eds.), Science and citizens: Globalization and the challenge of engagement (pp. 66–82). London: Zed Books.
Wynne, B. (2011). Lab work goes social, and vice-versa: Strategising public engagement processes. Science and Engineering Ethics, 17, 791–800.
Acknowledgements
The work reported in this article was partially conducted during the participation of the first and third authors in the GREAT project. Funded by the Seventh Framework Programme of the European Commission, the GREAT project aimed at developing new governance frameworks for responsible research and innovation in the European Union. The first and third authors thank fellow researchers in the GREAT consortium who contributed valuable discussions on responsible research and innovation. The first and second authors also thank the School of Economics and Business at Diego Portales University for funding the international seminar on business ethics in 2017. Many of the ideas that led to the integrative framework of ethical research and innovation reported in this article originated during conversations and discussions at this seminar. We also thank the anonymous reviewers who participated in the review process; their comments and suggestions greatly contributed to improving our article.
Funding
The work reported in this article was partly funded by the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement N° 321480.
Appendix A: An Illustration
The illustration presented in this appendix is modeled on the SPICE project, a project aimed at informing decisions for the development of climate geoengineering technologies (Stilgoe et al. 2013).
First Phase
In the first phase of inclusion, the strategists need to be defined. To this end, the ecosystem of the project is assumed to comprise two research organizations and a research-funding agency as strategists.
Second Phase
During the second phase, the alternatives need to be elicited. We assume that the project is currently transitioning from “lab tests to a field trial,” as was the case with the SPICE project (Stilgoe et al. 2013, p. 1575). As their legitimacy, urgency and power have increased, the strategists now face not only fierce opposition from nongovernmental organizations but also the need to respond to them (Mitchell et al. 1997). The alternatives are: (1) to continue with the field trial (\(a_1\)), (2) to go back to previous phases to conduct additional broader impact assessments and reengage with external stakeholders (\(a_2\)), (3) to put the project on hold until conditions for the field trial are propitious (\(a_3\)), or (4) to abort the project (\(a_4\)).
Once the alternatives have been elicited, the next step in this second phase is to define the criteria used to analyze them. The reflexivity criteria are compliance (\(c_1\)), internal communications (\(c_2\)), external communications (\(c_3\)) and reviews (\(c_4\)). The anticipation criteria are environmental risks (\(c_5\)), social risks (\(c_6\)), economic impact (\(c_7\)), and political impact (\(c_8\)). It should be noted that the potential risks listed above as criteria under the dimension of anticipation are included here for illustration purposes; the list is not meant to be comprehensive. In general, it will depend on the project at hand. In the case of the SPICE project, political, social and environmental risks were very salient, though technological risks in the area of geoengineering were also present. This is to be compared with other project types, such as those in the emerging Industry 4.0, where technological risks (e.g., in the area of cybersecurity) take center stage and would probably be included in the list of relevant criteria by stakeholders (Radanliev et al. 2020).
With the set of strategists, alternatives and criteria in place, the next step in this second phase is to elicit the weights of criteria for each strategist. To this end, we follow the procedure proposed by Thomas Saaty as part of the analytic hierarchy process (AHP), a widely used multicriteria decision analysis method (Saaty 1980). This method uses the Saaty scale to measure the relative importance of one criterion over another.Footnote 15 To elicit the weights of criteria, we first multiply the values in each row of the pairwise comparison matrix and raise the result to the power of 1/m, where m is the number of criteria; that is, we take the geometric mean of each row. The resulting value for each row is then divided by the sum of the values of all rows, which gives the normalized weight of each criterion.
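As a sketch of this weight-elicitation step, the geometric-mean procedure can be written as follows. The 3 × 3 pairwise comparison matrix below is hypothetical, chosen only to illustrate the arithmetic; it is not one of the matrices behind Tables 5 through 7.

```python
# Geometric-mean weight elicitation for AHP (Saaty 1980).
# The pairwise comparison matrix below is hypothetical.

def ahp_weights(pairwise):
    """Return normalized criterion weights from a reciprocal pairwise matrix."""
    m = len(pairwise)
    # Geometric mean of each row: (product of row entries)^(1/m).
    geo_means = []
    for row in pairwise:
        prod = 1.0
        for x in row:
            prod *= x
        geo_means.append(prod ** (1.0 / m))
    # Normalize so that the weights sum to one.
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical Saaty-scale judgments for three criteria:
# c1 is weakly stronger (3) than c2 and stronger (5) than c3;
# the lower triangle holds the reciprocal values.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

With these judgments the procedure yields a weight vector dominated by the first criterion, as expected from the pairwise preferences.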
Applying this method, the profile of strategist \(s_1\) gives higher importance to compliance with existing norms, internal communications and the economic impact of the project, as shown in Table 5.
The profile of strategist \(s_2\) is shown in Table 6. As we can see, complying with existing norms, communicating with internal stakeholders more than with external stakeholders, and assessing the project's economic impact are more important to the second strategist than anticipating environmental and social risks.
The profile of strategist \(s_3\) is shown in Table 7. Strategist \(s_3\) corresponds to a research-funding agency from the public sector. The third strategist is more concerned with the incorporation of external stakeholders and with the political risks of the project.Footnote 16
Once the set of alternatives, the set of criteria, and the m-dimensional vectors of weights (one per strategist) have been elicited, the evaluation matrix of alternatives containing the values \(v_{ji}\) that each alternative \(a_j\) delivers under each criterion \(c_i\) is generated. This matrix \(E_{nm} = [v_{ji}]\) is shown in Table 8.Footnote 17
Third Phase
The third phase is implemented using a multicriteria decision analysis method. We illustrate this process using the TODIM method proposed by Gomes and Lima (1991).
Using the evaluation matrix of alternatives \(E_{nm} = [v_{ji}]\), which is one of the outputs of the second phase of our methodology, the value function \(\phi ^{k,i}\) of TODIM computes a pairwise comparison of the values \(v_{hi}\) and \(v_{ji}\) that the pair of alternatives \((a_h, a_j)\) deliver under criterion \(c_i\). This yields \(m \times l\) partial dominance matrices \(\varPhi ^{k, i}\) containing the values \(\phi ^{k,i}_{hj}\), which represent the partial dominance of alternative \(a_h\) over alternative \(a_j\) under criterion \(c_i\) for strategist \(s_k\), with \(1 \le i \le m\), \(1 \le h, j \le n\), and \(1 \le k\le l\).
The value function \(\phi ^{k,i}\) is given by the following expression:
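The expression itself does not survive in this version of the text. Based on the standard TODIM formulation (Gomes and Lima 1991) and the symbols defined above, with normalized weights \(w^k_i\) and an attenuation factor \(\theta\) for losses, Eq. (1) can be reconstructed as:

```latex
\phi^{k,i}_{hj} =
\begin{cases}
  \sqrt{\dfrac{w^k_i \, (v_{hi} - v_{ji})}{\sum_{r=1}^{m} w^k_r}}
    & \text{if } v_{hi} - v_{ji} > 0, \\[6pt]
  0
    & \text{if } v_{hi} - v_{ji} = 0, \\[6pt]
  -\dfrac{1}{\theta}
  \sqrt{\dfrac{\bigl(\sum_{r=1}^{m} w^k_r\bigr)\,(v_{ji} - v_{hi})}{w^k_i}}
    & \text{if } v_{hi} - v_{ji} < 0.
\end{cases}
\tag{1}
```

Note that the published equation may instead use weights expressed relative to a reference criterion, as in some presentations of TODIM; the structure (concave gains, attenuated convex losses) is the same.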
The profile of each strategist is brought to bear in (1) by the weight \(w^k_i\) that each strategist \(s_k\) attaches to each criterion \(c_i\). Using the partial dominance matrices \(\varPhi ^{k, i}\) for each criterion and strategist, the final dominance matrix of alternatives is computed using the function \(\delta ^k\), with \(1 \le i \le m\), \(1 \le h, j \le n\), and \(1 \le k\le l\). Each of the l final dominance matrices is computed using the following expression:
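Following the definitions above, the final dominance value is the sum of the partial dominance values over all criteria; the missing expression can be reconstructed as:

```latex
\delta^k_{hj} = \sum_{i=1}^{m} \phi^{k,i}_{hj},
\qquad 1 \le h, j \le n, \quad 1 \le k \le l.
\tag{2}
```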
Equation 2 generates l matrices of dominance of alternatives \(\varDelta ^{k}\), one for each strategist, containing the values \(\delta ^k_{hj}\) representing the dominance of alternative \(a_h\) over alternative \(a_j\) for strategist \(s_k\). Each of these dominance matrices corresponds to an evaluation matrix of strategists (EMS\(_k\)), with \(1 \le k\le l\), which is the output of the first step of the third phase of the methodology proposed in the “Operationalizing the Framework” section of our article. Finally, the global value that each alternative \(a_h\) yields for strategist \(s_k\), with \(1 \le h \le n\) and \(1 \le k\le l\), is given by expression (3):
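A reconstruction of expression (3), summing the dominance of \(a_h\) over all alternatives:

```latex
\xi^k_h = \sum_{j=1}^{n} \delta^k_{hj},
\qquad 1 \le h \le n, \quad 1 \le k \le l.
\tag{3}
```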
In order to generate the rankings as part of the second step of the third phase of our methodology, \(\xi ^k_{h}\), the global value that each alternative \(a_h\) yields for strategist \(s_k\), with \(1 \le h \le n\) and \(1 \le k\le l\), needs to be normalized as per expression (4):
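A reconstruction of the min–max normalization of expression (4):

```latex
\bar{\xi}^{\,k}_h =
\frac{\xi^k_h - \min_{1 \le j \le n} \xi^k_j}
     {\max_{1 \le j \le n} \xi^k_j - \min_{1 \le j \le n} \xi^k_j},
\qquad 1 \le h \le n, \quad 1 \le k \le l.
\tag{4}
```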
The normalized global value of each alternative given by Eq. 4 leads to the ranking of alternatives for each strategist \(s_k\), which is the output of the second step of the third phase of the methodology set out in the “Operationalizing the Framework” section of our article.
Applying Eqs. (1) through (4), we obtain three evaluation matrices of strategists (EMS\(_k\)), with \(1 \le k\le 3\), and three rankings of all four alternatives, as shown in Tables 9, 10 and 11.
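Since Tables 5 through 11 are not reproduced here, the end-to-end computation for a single strategist can be sketched with hypothetical weights and Likert scores, assuming the standard TODIM formulation with normalized weights and attenuation factor \(\theta = 1\):

```python
import math

def todim_ranking(values, weights, theta=1.0):
    """Normalized global values for one strategist under the standard TODIM scheme.

    values[h][i] : Likert score of alternative a_h under criterion c_i
    weights[i]   : normalized weight the strategist attaches to c_i
    Returns one value in [0, 1] per alternative (higher is better).
    """
    n, m = len(values), len(weights)
    wsum = sum(weights)

    def phi(h, j, i):
        # Partial dominance of a_h over a_j under criterion c_i (cf. Eq. 1).
        d = values[h][i] - values[j][i]
        if d > 0:                       # gain
            return math.sqrt(weights[i] * d / wsum)
        if d == 0:
            return 0.0
        # loss, attenuated by theta
        return -(1.0 / theta) * math.sqrt(wsum * (-d) / weights[i])

    # Final dominance of a_h over a_j (cf. Eq. 2) and global values (cf. Eq. 3).
    delta = [[sum(phi(h, j, i) for i in range(m)) for j in range(n)]
             for h in range(n)]
    xi = [sum(row) for row in delta]

    # Min-max normalization (cf. Eq. 4).
    lo, hi = min(xi), max(xi)
    return [(x - lo) / (hi - lo) for x in xi]

# Hypothetical 4 x 3 evaluation matrix (Likert 1-5) and hypothetical
# weights -- not the actual values behind Tables 5 through 8.
values = [
    [2, 3, 1],   # a1: continue with the field trial
    [4, 4, 3],   # a2: reengage and reassess
    [3, 4, 4],   # a3: put the project on hold
    [3, 2, 5],   # a4: abort the project
]
weights = [0.5, 0.3, 0.2]

scores = todim_ranking(values, weights)
ranking = sorted(range(4), key=lambda h: scores[h], reverse=True)
print(ranking)  # indices of the alternatives, best first
```

With these illustrative numbers the field-trial alternative \(a_1\) ranks last, echoing the pattern reported in the text, although the actual values of Tables 9, 10 and 11 differ. Running the function once per strategist, each with their own weight vector, yields the l rankings.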
Different profiles of strategists lead to different evaluation matrices of strategists, which may in turn lead to different rankings of alternatives. In the case of our illustration, the first and third strategists would prefer to continue with the execution of the project by postponing the field trial and investing more resources in the phases of reflexivity and anticipation. The second strategist, on the other hand, would prefer to put the project on hold. For all three strategists, even aborting the project is a better strategy than continuing with the field trial as originally planned.
The last step during the third phase of our methodology would be to generate a consensus strategy. If the plurality principle were to be applied,Footnote 18 then the second strategy would be the consensus strategy to be pursued. The strategists may agree to apply other rules to arrive at a consensus strategy (Fishburn 1973; Munda 2004), such as the Borda count (Borda 1784; Condorcet 1785; McLean and Urken 1995), which would lead to a different consensus strategy.
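The two aggregation rules mentioned above can be sketched as follows. The three rankings are hypothetical (best first, over alternatives a1 through a4), chosen so that the two rules diverge; they are not the rankings of Tables 9 through 11.

```python
from collections import Counter

def plurality_winner(rankings):
    """Alternative most often ranked first across strategists (plurality rule)."""
    first_places = Counter(r[0] for r in rankings)
    return first_places.most_common(1)[0][0]

def borda_winner(rankings):
    """Borda count: an alternative in position p of a ranking of n
    alternatives receives n - 1 - p points (top position p = 0)."""
    n = len(rankings[0])
    points = Counter()
    for r in rankings:
        for position, alt in enumerate(r):
            points[alt] += n - 1 - position
    return points.most_common(1)[0][0]

# Hypothetical rankings, best first, for three strategists.
rankings = [
    ["a2", "a3", "a4", "a1"],   # strategist s1
    ["a2", "a3", "a1", "a4"],   # strategist s2
    ["a3", "a1", "a4", "a2"],   # strategist s3
]

print(plurality_winner(rankings))  # a2 is ranked first by two of three
print(borda_winner(rankings))      # a3 accumulates the most Borda points
```

Here the plurality rule selects a2 while the Borda count selects a3, illustrating how the choice of consensus rule can change the consensus strategy.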
It is important to note that the methodology shown in Fig. 6 is iterative and dynamic: at any given point in time, the flow of control can go back to previous phases to revisit decisions that have already been made. The inclusion of new stakeholders, for example, would require the flow of control to go back to the first phase, whereas the addition of new criteria or the modification of the profiles of strategists would require it to go back to the second phase. In the same way, other types of alternatives can emerge and be considered by the strategists by backtracking to the second phase of the lifecycle.
Our methodology does not endorse a particular multicriteria group decision analysis method. Different multicriteria group decision analysis methods can be used interchangeably to implement the third phase of our methodology, and the choice will always depend on the type of multicriteria decision analysis problem at hand. While the TODIM method used in this illustration is of interest for modeling the biases of human decision-making, especially those that arise in the domain of losses under deep uncertainty, in many cases more computationally tractable methods, such as the TOPSIS method (Hwang and Yoon 1981), may be preferred, especially when considering large sets of criteria, alternatives, and strategists.
Paredes-Frigolett, H., Singer, A.E. & Pyka, A. A Framework for Ethical Research and Innovation. Sci Eng Ethics 27, 11 (2021). https://doi.org/10.1007/s11948-021-00287-9