Science and Engineering Ethics, Volume 19, Issue 1, pp 299–308

Extended Report from Working Group 5: Social Responsibility of Scientists at the 59th Pugwash Conference on Science and World Affairs in Berlin, 1–4 July 2011


Social responsibility is a core element in all Pugwash activities. Since the first Pugwash conference the topic has from time to time been included on the agenda of Pugwash conferences and workshops. At the 59th Pugwash conference, held in Berlin from 1–4 July 2011, a working group met for thorough discussions of this subject. The working group had 20 participants from 11 countries. More than half of the participants had a background in physics, but social scientists, scholars of the humanities and researchers from other natural sciences were also represented. Ten papers were presented and discussed in the working group.

The deliberations in the working group revealed that the responsibilities of scientists look different depending on the perspective from which they are approached. Scientists, decision-makers and consumers may perceive the social responsibility of scientists and technical experts quite differently. The issue of responsibility is complicated by highly charged differences in the perception of trends in and uses of science, engineering and technology. These differences materialise in particular scientific and technological projects, where normative and cultural features of certain scientific-technological developments come into play. Once realized, these developments challenge us to take a stand and to find appropriate approaches for coping with their ambivalence. This ambivalence may come to bear with regard to dual-use aspects of technology, the tension between pure and applied research, and the relation between the academic and the corporate worlds. The point of bringing in the ambivalence of science is that it characterises a newly emerging mode of techno-scientific practice. It is argued that the scientific condition is changing, that science is becoming more ambivalent than before, and that we need to adjust our understanding of social responsibility to this new mode of science, engineering and technology. The report takes up these points. In doing so it tries to distinguish between the individual, the institutional and the cultural dimensions of the social responsibility of scientists.

Perspective of Individual Scientists

Some scientists and technical experts, and most experts who attend Pugwash conferences, feel they have an individual responsibility to reflect on their role as scientists/experts and on the possible misuse of their work.

The social responsibility of scientific experts and the misuse of research are more complex issues than immediately meet the eye. Projects that at first glance have no dual use might, when scrutinised more closely, be revealed as potentially ethically questionable.

Most scientists are not prepared to reflect on their responsibilities as scientists: science education does not usually prepare experts to reflect on the wider consequences of their work. This may make it easier for policymakers and other powerful stakeholders to enlist scientists to serve the former's agenda. Merchants of doubt, i.e. scientists who are paid to cast doubt on robust scientific conclusions regarding e.g. human health and environmental issues, illustrate this point.

According to the Uppsala Code of Ethics for Scientists, a scientist can be defined as socially responsible if there is correspondence between the wider consequences of his or her work and the ethical principles that he or she holds.

To avoid a subjective or relativistic definition of the social responsibility of scientists, the term ethical principles needs clarification. Not all guidelines are ethical, and neither are all actions that follow such guidelines.

It was suggested that ethical principles need to be socially negotiated, and international treaties were mentioned as examples of ethical principles that the actions of scientists and technical experts must not violate. This suggestion was challenged by the view that scientists must rather compare the wider consequences of their work with their own ethical principles, as long as these principles (1) respect the norms of others to the extent that those norms deserve to be respected and (2) are applied consistently in similar situations (no double standards). A law can be unethical and therefore in need of being changed. It was noted that social responsibility is a wider concept than ethics. Codes of conduct can also serve as ethical principles. It was mentioned that a code of conduct in nanoscience and nanotechnology has been formulated and adopted by the European Commission, and steps towards its implementation will soon be taken.

A number of examples of socially responsible scientists were given, and their actions categorised as conscientious objection, giving policy advice, informing the public, choosing research and development projects that are clearly ethical, whistleblowing, and/or civil disobedience. The actions of Sir Joseph Rotblat were mentioned several times by a number of working group participants. The case of Sir Joseph Rotblat constitutes an example of socially responsible behaviour from which all scientists can learn.

Science-based policy advice was discussed and highlighted as an area of science and technical expertise where scientific facts and advocacy mix. A few cases of scientific advice were shared among group members, including the experiences of the British Pugwash group, which tried to convey expert research-based advice to the British government on extending the lifetime of the UK’s Trident nuclear submarines. The then Minister of Defence later commented publicly that the ‘internal’ advice he had been given on this issue had been flawed. Interestingly, the final decision to go ahead with replacement has been postponed for another 5 years, thus vindicating the independent advice given at the time but ignored. It was pointed out that “speaking truth to power” is no longer an accurate description of the science-policy interface.

The case was made that scientific knowledge can justify certain kinds of advocacy: the consequences of weapons of mass destruction, the effect of tobacco on human health and the potential consequences of climate change were mentioned as illustrations of this point.

The Working Group members agreed that the individual perspective on social responsibility does not suffice. The resistance of one scientist or technical expert will not change anything unless it reaches public opinion and the agenda of policy-makers. This does not excuse individual scientists or technical experts from exercising social responsibility; rather, it adds social, institutional and cultural layers to the discussions of social responsibility. The social responsibility of institutions that employ scientists and technical experts (corporations, universities, military and other governmental entities, international bodies etc.) also needs attention.

Social Responsibility from the Perspective of Users of Scientific Results and Technology

Consumers, decision-makers, the general public and other users of scientific knowledge and technological products are more focussed on results and products than on the value systems of the experts. They expect high-quality advice, robust findings and safe products. Consumers do not expect mobile phones to cause brain cancer, bottled drinking water to be polluted or food products to be dangerous. They link the social responsibility of technical experts to safe products, and hold the experts responsible if these expectations are violated. Hence, the social responsibility of scientists and technical experts turns into a question of the quality of scientific and technological production.

Emerging Technological Risks

Both individual scientists’ and consumers’ views on social responsibility of scientists rest on the premise that the wider consequences of scientific work and technological products can be foreseen. Individual scientists and technical experts can devote part of their work time to trying to assess the wider consequences of their work. However, such predictions are complex, uncertain and therefore very difficult to make. Individual attempts to map the wider consequences do not suffice, and need institutional underpinning.

The French College for the Prevention of Technological Risk, which operated from 1989 to 1996, was mentioned as an example of an institution informing the public and advising policymakers on the wider consequences of science and technology. The members of the college came from different disciplines and professions and operated on a consensus basis. The college produced 17 expert opinions and recommendations on a number of topics (excluding military issues, which it was not allowed to address).

Premises for a well-performing institution fostering social responsibility were identified as (1) sufficient and continuous funding, (2) participation of scientists and others from different disciplines and professions, and (3) independence from special interests.

The issue of uncertainty associated with emerging technological risks was raised. It was discussed whether it is possible to reach a consensual estimate of technological risks, or whether the consensual approach represses expert disagreement by requiring experts to reach a consensus. It was argued that disagreement among technical experts is an under-researched topic that needs scholarly attention.

Different models of institutional entities for assessing technological risks were proposed, such as ad hoc committees set up by national parliaments and interdisciplinary university programmes with participation of external stakeholders. Criticism was raised against both models: a parliamentary committee might not be able to steer free of special interests, and a university programme might not be able to steer free of disciplinary quarrels. It was suggested that one could try to engage retired scientists in the assessment of technological risks. Their livelihoods would not be at stake, though their own well-being and that of their families might be.

The Working Group members agreed on the need to set up new, or strengthen existing, institutions that support the social responsibility of scientists and technical experts. Individuals should not have to carry their social responsibility without institutional backing. It was stressed that these institutions should follow scientific principles.

Putting scientific principles to work cannot rely exclusively on the mores of academic science (results are based on empirical data, which are linked to scientific theories, and the results are written up and submitted to peer review). It was suggested that the practice of science also encompasses an obligation to disseminate results to the public and to decision-makers, as well as an obligation to steer science and technology in directions that are expected to benefit human development. Science was not perceived as innocent, value-neutral or disconnected from ethics. Values can influence the practice of science as well as the acknowledgement of the validity of scientific results.

Secrecy in R&D and Whistleblowing

Scientists who publish results that run counter to military or commercial interests put their careers at stake, which may tempt them to hold back publication of inopportune conclusions. To illustrate this point, reference was made to a recent communiqué from the AAAS stating that a number of climate scientists have received death threats.

A number of whistleblower case-studies were shared among the group members. The cases discussed regarding missile defence R&D (involving Dr. Nira Schwartz, Dr. Subrata Ghoshroy and Prof. Theodore A. Postol as well as Roy Woodruff) illustrated how much courage it takes to practice social responsibility in a military setting. It was noted that the examples discussed highlighted the poor quality of missile defence technology, which diplomats referred to so positively during the Simons Symposium on European Security and Nuclear Disarmament on the first day of the 59th Pugwash conference. The case was made that diplomats and policymakers should update their technical knowledge related to the missile defence programme.

Group members argued that the cases discussed could be read as an indication that results generated by military R&D are in danger of being of lower quality than those from civil R&D, because the latter are subjected to more public scrutiny and peer review. This claim was not documented; it was noted, however, that material documenting the poor quality of the missile defence project had been systematically suppressed. On the surface everything was claimed to work perfectly because the decision-makers “believed” in the project and did not want their belief to be shaken by adverse information. This still seems to be the case. Many working group participants explained the poor technological quality by the fact that results and products based on classified data cannot be checked by independent experts. Military R&D may thus not follow important scientific principles, which is also why fraud and misconduct in military research are especially hard to detect. On the other hand, the lack of transparency makes it impossible to know how common fraud is in military R&D. One consequence of this lack of openness is that policy-making regarding military issues might be based on dubious data, assumptions and interpretations, leading to potentially unsafe conclusions.

University physics research projects are increasingly being funded by military sources, whereby secrecy and non-scientific principles find their way into university science. The US military budget funds a large portion of physics projects in industry and at universities, not only in the US but also in Latin American and European countries. The issue of secrecy in university science might be global, and one can only speculate whether the cases discussed by the group are representative of military-funded R&D projects as such.

Competition, an intrinsic aspect of academic science, also encourages secrecy. Scientists may hold on to their research findings for fear that competitors might steal them before they are published. Achieving recognition among peers, regrettably often a significant motor of academic science, may also generate secretive behaviour. Competitive funding systems and increased investments of private capital in universities reinforce the competitive element of science, and hence the tendency to keep data and results under wraps. It was proposed that phasing out scientific prizes would be one way of countering the effects of competition.

Parallels with other research areas, e.g. the social sciences and life sciences, were drawn: physics is not the only field heavily funded by military sources. Military-funded university research and research results generated by university-private partnerships were compared and found similar: both involve secrecy, and their research results are not controlled by peer review. The collaboration between UC Berkeley and British Petroleum was mentioned in this connection, and the group’s attention was drawn to the scientific results published by Berkeley researchers during the oil catastrophe in the Gulf of Mexico identifying a micro-organism that was said to decompose the oil without depleting oxygen. It was reported to us that this study was based on no more than 200 unrepresentative samples, a weak basis from which robust conclusions could not be drawn.

The understanding of knowledge production in settings characterised by secrecy is poor, and the working group recommends that increased attention be paid to this apparently expanding mode of knowledge production.

It was acknowledged that it is difficult for individual scientists and technical experts to attract the attention of policy-makers and the media to findings regarding adverse effects, failed experiments or dysfunctional technology. They may also find it difficult to publish their results in scientific journals.

Hence, the working group recommended the establishment of institutions that could help whistleblowers and others to (1) verify their claims and (2) gain public and political attention for their findings, as well as (3) provide them with some form of protection. Such institutions would need to be unbiased and free of conflicting interests. VDW’s whistleblower award was mentioned as an example of such an institution, and it was proposed that a similar independent institution be established at the European level.

Better legal protection of whistleblowers was also suggested. Currently the legal protection of whistleblowers differs from country to country. A common denominator of such legal regimes is that they rank lower than legislation regarding national security.

Some group members had lost their faith in today’s universities, as these are giving up their independence. Universities frequently sign agreements with private companies, and in such cases secrecy and corporate norms are likely to influence the university research agenda. Scientific societies might provide the institutional setting for protective institutions; many engineering societies already have mechanisms that support whistleblowers. Other group members argued that we should not give up on universities, and they suggested that smaller universities might find a niche in setting up such entities. It was pointed out that in Germany a campaign is well under way to keep universities free from military research. Some universities have already made such pledges.

It was also noted that those subjected to whistleblowing quite commonly attempt to present whistleblowers as crackpots who are not to be taken seriously. There is thus a need for peer review as a first line of defence against such claims.

Dual Use of Science and Technology

One obvious example of dual use mentioned during the group’s deliberations is uranium enrichment, which can be used both for nuclear weapons and for civil energy production. Software that can be used to protect actors in civil society against hackers and other criminal acts in cyberspace, but also in warfare, is another. Monsanto’s Agent Orange, used in the Vietnam War, together with its civil counterpart, is a third.

It was said that the products and the organisation of military and civil research quite often, and perhaps increasingly, mix, and that the boundaries between these two domains are blurred. Reference was made to the triple-helix military-industrial-university complex (Monsanto’s collaboration with Blackwater was mentioned as one example), and again it was noted that this mode of knowledge production is poorly understood; analyses of it were called for.

Many of the group participants found military research and development unethical, and suggested means to limit the negative dual use of technologies:
  • Changing research funding priorities and directing military funding towards research for human development. Reference was made to the already-mentioned campaign calling for a long-term commitment from universities to keep themselves free of military funding. Researchers are encouraged to go where the money is, though many researchers do not know where their research funding originates. Increased attention to the origins of research funding was suggested.

  • When it is possible to foresee the consequences of research and development, including dual use, one should try to disentangle the civil and the military parts, so that emphasis can be placed on the civil parts of science and technology. Researchers might also refrain from discussing potential military uses of civil research and in that way limit military uptake.

  • Increased transparency in the research process was called for, to facilitate the categorisation of R&D as military, civil or dual-use. This would also ease the protection of whistleblowers and would likely improve the quality of military research.

It was mentioned that research funded by military sources may not be presented at meetings, or in journals, of the Physical Society of Japan. This example again highlights the role of scientific societies.

Science and Technology for Human Development

Structural support that enables researchers to actively choose socially responsible research projects was illustrated by European funding of nanoscience and nanotechnology. The case illustrates that changed funding priorities can support the socially responsible production of knowledge and technology. Two categories of socially responsible research funding were mentioned: (i) funding for research that aims at disclosing negative and unforeseen consequences of nanotechnology and (ii) funding for research that actively aims at human development, e.g. R&D projects that aim at realising the UN Millennium Goals. Socially responsible research can, in other words, both ensure safe products and contribute to human development.

The increased allocation of funding to socially responsible projects in nanoscience and nanotechnology was explained by a fear of repeating the mistakes made in biotechnology and in technology development related to GMOs, where a lack of user inclusion has generated public mistrust of e.g. GMO food products.

It was noted that it is important to hold researchers who receive research funding aimed at contributing to human development accountable for the impact of their research, so that hype and the overselling of future impact are avoided. When the intended outcomes are not achieved, attention should be devoted to mapping the barriers that prevented them.

Uncertainty

The earthquake and the subsequent tsunami that caused the Fukushima nuclear accident, and their relation to the social responsibility of the scientists and technical experts involved, were also discussed in the working group. The Fukushima event is an example of an unforeseen catastrophic incident. How do we handle such incidents responsibly?

Scientists and technical experts are expected to identify and communicate early warning signals. The scientific community is expected to carry out peer review on such signals so that faulty warnings are filtered out. If warnings are considered real, technical experts involved in existing safety arrangements are expected to react appropriately. If the probability of a risk materialising is considered insignificant, a responsible action might be to consciously disregard the risk while informing the public and decision-makers of this conclusion. Before the Fukushima accident, early warnings were given but ignored.

The group discussed whether unlikely accidents can realistically be foreseen, and some group members had the impression that disastrous events can rarely be predicted in a reliable way. It was acknowledged, however, that scientists should try to do so. Uncertainty and unforeseen consequences are part of dealing with high-tech solutions in a responsible way.

Once an accident has occurred, scientists and technical experts are expected to help in the disaster management. As was mentioned above, a society can be prepared for such management by providing a multi-faceted routine for dealing with the unexpected. General emergency preparedness in Japanese society seems to have functioned sufficiently well to prevent the kind of panic that would have turned the large-scale technological failure into an all-encompassing social catastrophe.

The accident needs to be monitored and the generated data should be made available, so that conclusions regarding the management of calamities of this kind can be checked, and so that the public and decision-makers in other countries can be informed about ways and means of dealing with a catastrophe of this magnitude. Not all working group participants were satisfied with the monitoring of radiation levels etc. that the Japanese authorities carried out in the affected areas. The restricted transparency and limited sharing of the collected data were also criticised by some working group participants.

The political consequences of the Fukushima accident were discussed, but participants did not come to agreed conclusions.

Science Education

Several working group participants mentioned that the education of scientists and technical experts is an important instrument for preparing future generations for socially responsible conduct. It was suggested that a database be compiled of examples of scientists and technical experts who are held to have exercised socially responsible conduct. These examples could be used in science education to encourage an appreciation of social responsibility among emerging scientists and engineers. Some group members felt that our current system rewards socially irresponsible conduct, and that the compilation of such examples for use in science education might contribute to changing that. Many scientists and technical experts are not aware of dual-use issues; it was proposed that science and engineering education should also address this topic.

Cultural Aspects

Science, engineering and technology are rooted in culture, and it has increasingly been argued that cultural assumptions and values shape scientific and technological practices. Several participants in the working group argued that we cannot solve the problems of our times, e.g. reach the UN Millennium Goals, if we only work within the existing dominant cultural framework. We need new lifestyles and new ways of thinking. The cultural perspective on the social responsibility of scientists was illustrated in a presentation on peace research that directed the group’s attention to the basic assumptions on which research and development is based. Peace research was presented as a response to the academic field of strategic studies, which was considered to reproduce values and assumptions held by the military and the political establishment. A first step towards initiating cultural change, it was argued, is to be transparent about the cultural aspects (assumptions and values) of research and development and where they originate. These reflections were translated into a redefinition of the social responsibility of scientists: a scientist, engineer, scholar or knowledge worker holds a responsibility to reflect upon the character of academic work as a social practice. This definition does not link the social responsibility of scientists to certain material outcomes.

Recommendations

One overall conclusion of this working group was that the character of science, engineering and technology is changing. Research activities are becoming ambivalent in a number of ways: civil and military research, pure and applied science, and academic and corporate knowledge production are becoming increasingly difficult to separate. In this new context, the working group recommends to the Pugwash Council that Pugwash reinvigorate and extend its activities in clarifying and discussing the social responsibility of scientists.

The design and characteristic traits of the institutional support that socially responsible scientists require need clarification and further development. Working Group 5 suggests that a workshop be organised, possibly in Argentina, to explore the social responsibility of scientists in the changing science and technology environment.

The working group furthermore recommends that Pugwash establish a study group on this topic. Initially the group would prepare a paper based on the extended version of the working group report, adding relevant references, expanding on the examples and sharpening the arguments. The working group proposed that it establish itself as a de facto Pugwash study group and initiate its work at once. It is recommended that a Pugwash email list be established to facilitate the study group’s work.

The working group also very strongly recommends that more world-affairs-related scientific issues be dealt with in plenary sessions at the next ‘annual’ Pugwash Conference on Science and World Affairs, as well as in future years.

Copyright information

© Springer Science+Business Media B.V. 2011

Authors and Affiliations

  1. Department of Learning and Philosophy, Aalborg University, Ballerup, Denmark
