Science & Education, Volume 23, Issue 2, pp 485–507

Reframing and Articulating Socio-scientific Classroom Discourses on Genetic Testing from an STS Perspective

  • Dirk Jan Boerwinkel
  • Tsjalling Swierstra
  • Arend Jan Waarlo
Open Access


In recent decades, Science & Technology Studies (STS) have revealed the dynamic interaction between science and technology and society. Technology development is not an autonomous process and its artifacts are not socially inert. Society and technology shape each other. Technologies often have ‘soft impacts’ in terms of unpredicted side effects on individuals and society. Nevertheless, current societal discourse on technological innovations is still dominated by ‘hard impacts’ such as quantifiable risks for health, safety and the environment. Furthermore, participants in socio-scientific discourses often underestimate their agency in influencing technological innovations, and at the same time overestimate their freedom of choice to use a technology. Past debates on technological innovations have shown how these debates were framed and often caught in fruitless discourse patterns or arguments. Interventionist STS research experiments with solutions to this problem. Assuming that an STS perspective is helpful in reframing and articulating socio-scientific classroom discourses, the case of genetic testing is used to explore this. An important positive ‘hard impact’ of genetic testing is disease prevention. However, this is put into perspective by addressing ‘soft impacts’ such as limited access to certain careers based on genetic risk and changes in the conception of health and the perception of responsibility for one’s health. Discussion stoppers such as ‘playing God’ or ‘We can’t stop technological advancement’ can be challenged through uncovering underlying assumptions. The use of narratives and future scenarios in classrooms seems fruitful in provoking imagination and engaging students in public debates on technological innovations.

The message of the gene appears to be equivocal. It promises certainty, order, predictability and control, but also helplessness, limitation and acceptance of biological fate. It can stress both similarities and differences. It can reveal human relations and at the same time break them off (Ten Have 1995, p. 29; translation from Dutch).

1 Introduction

In the 1990s, the top-down implementation of biotechnology encountered intense public scrutiny and vehement social resistance (e.g. Gaskell and Bauer 2001). The gap between developers and users of science-based technology became obvious. This resulted in a paradigm shift both in Science & Technology Studies (STS) (Wynne 2006) and in Science Communication (Bucchi 2008). The power of reciprocity was recognized, and a new vocabulary arose with terms like co-production and co-evolution of science, technology and society, early anticipation, interaction, dialogue and participation.

In the last decade, this ideological turn was followed by designing and studying social interaction of stakeholders so as to understand and improve the societal embedding of emerging science and technology.1 Science education and communication are crucial in empowering people to participate in the societal debate on applications and implications of technologies. For example, the Dutch DNA Labs on the Road are a joint activity of genomics centers and universities targeted at upper secondary education (Van Mil et al. 2010). This outreach activity includes opinion-forming on socio-scientific issues associated with genomics applications.

The authors jointly cover the fields of STS, Science Communication and Science Education. In this article, we will discuss new insights from STS, including Science Communication, and explore how these might inform science and technology education for citizenship in a democratic society. The exemplary focus will be on genetic testing, because at some point in our lives, all of us will likely be confronted with this socio-scientific issue. Within this topic, we will further concentrate on technology (in the making) and on the various impacts of genetic testing. We will also examine what we can learn from past debates on new and emerging technologies, in particular how debates are framed and often caught in fruitless discourse patterns or arguments (Stemerding et al. 2010), and how this might inform and improve socio-scientific classroom discourses.

In summary, the following research question will be addressed in this article: How can STS research on the dynamic interaction between science, technology and society contribute to reframing and articulation of socio-scientific classroom discourses on genetic testing?

In answering this question, we hope to contribute to a more realistic image of technology for students and to equip them for individual and collective decision-making concerning the development and use of technology. In doing so, we aim to enhance their agency.

This paper is part of an ongoing research project on genomics education for citizenship and is aimed in particular at theoretical rethinking of science education and at providing educational (re)design criteria. Our context is Dutch education. However, we draw widely on English-language literature and believe that our work has applicability beyond our own context.

2 Co-evolution of Science and Technology and Society

For a long time, science and technology were believed to develop almost autonomously, without much interference from society (Smith and Marx 1994). In the words of the famous motto of the Chicago World’s Fair in 1933: ‘Science Finds, Industry Applies, Man Conforms’ (Meijers 2009, p. 48). Traditionally, this technological determinism was based on the assumption of an autonomous, quasi-Hegelian, scientific or technological logic unfolding itself (see, for example, Ellul 1964; Postman 1992). Nowadays, the justification of this determinism is usually somewhat more political: technology cannot be steered or controlled because of the relative powerlessness of any single nation state in a globalizing economy (Mitcham 1994; Swierstra and Rip 2007). An additional often-heard argument is that technological development cannot be stopped because of the psychological mechanism that somehow forces consumers always to desire and demand what has become technologically available (Postman 1992).

In this determinist perception, humans are powerless to influence the course of scientific and technological development. Their power supposedly lies elsewhere: although incapable of influencing the process of scientific and technological development, humans are supposed to have complete power over its products. Technological artifacts are depicted as passive and docile instruments that do not influence the user in any way. The typical argument here is that every technology can be used for good or bad; therefore, if something goes awry, only the user is to blame, not the instrument, as in the American bumper sticker that reads: ‘Guns don’t kill people; people kill people.’ This perception of technological artifacts as lacking any agency or societal impact of their own is referred to as the ‘instrumental’ perception of technology (e.g. Waelbers 2011, pp. 16–19).

As a result of the popular determinist and instrumentalist perceptions, to this day science and technology on the one hand and society on the other hand are often perceived by scientists, technologists, policy makers and the larger public to be two separate entities (‘science and technology and society’). However, this ‘non-interaction’ view has been refuted in recent decades by an expanding body of theoretical work, originally primarily in the field of the philosophy of technology2 but later also in the history, sociology and ethnography of science and technology. The perception of technology as autonomously developing has been refuted by authors working in the tradition of so-called Social Construction of Technology or Actor–Network Theory.3 These authors have demonstrated minutely with a host of historical and sociological evidence how social power and social values deeply influence the scientific and technological research agenda, as well as the designing and subsequent social embedding of science and technology (‘science and technology in society’).

Nowotny et al. (2001) argued that, in modern times particularly, the boundaries between science, technology, economy and society have eroded in the sense that modern science and technology development is more and more determined by the promise of future applications. Thus, science, technology and society are not separate entities but are fundamentally entangled. Furthermore, STS have shown that technological artifacts are not to be understood as neutral and passive instruments but rather as value-laden and active forces, as ‘agents’ that influence how we live, how we act, how we relate to each other, how we understand the world and our place in it, and what we aspire to, desire and hope for (Keulartz et al. 2004; Verbeek 2005). To return to the bumper sticker: indeed, the gun may not shoot someone on its own, but its availability certainly does influence how the user thinks and acts. In other words, the users are influenced by the technologies they use. Science and technology co-shape our existence.

If we combine these two separate criticisms of the standard perception of the relation between science and technology and society—that is, society influences science and technology, and science and technology influence society—we identify a phenomenon that has been referred to as either the co-evolution (Rip and Kemp 1998) or the co-production (Jasanoff 2004) of science and technology and society.

2.1 Persistent Images of Technology Limit Public Discourse and Participation

The notion of co-evolution or co-production implies that science and technology exert an enormous influence over the lives of the citizens of modern ‘technological cultures,’ but that these citizens are not condemned to passivity: it is possible to influence and shape the scientific and technological developments that shape us. Citizens can, for example, influence how the problems for which technology is supposed to provide a solution are defined, by providing alternative formulations (Zwart et al. 2006). For instance, hunger may be defined as a technical problem, that is, as a lack of nutrients, for which genetically modified rice can then provide the solution. But it may also be defined as a political problem for which more democracy and a better distribution of food is the solution.

Furthermore, citizens can compare and assess alternative social and/or technological means to solve the problems, or they can come up with ways to use the available technologies that are quite different from what the technology developers had foreseen (Oudshoorn and Pinch 2003). Of course, their influence is not unlimited. It is restricted by existing power structures and by technological decisions in the past. It is not easy to switch to electric cars when the material infrastructure, the identities of the consumers and the interests of producers and others have all evolved around the petrol car. Of course, it has to be accepted that, faced with ‘socio-technical systems’ in which social and technical routines are deeply intertwined, the influence of the individual citizen is often negligible. However, the same is true for democratic politics in general, and this is rarely accepted as a good reason for citizens to give up the attempt to improve society (Collingridge 1980).

Thus, technological development is not autonomous but can be influenced by the citizens that bear the consequences, good or bad, of those technologies. In many countries, especially in western Europe and the USA, recent decades have therefore witnessed experiments with engaging the public in discussions on where (not) to go with science and technology. The Netherlands, for example, has organized ‘broad societal dialogues’ on, amongst others, nuclear energy, genetics and food, and recently on nanotechnology. To a large extent, these initiatives were a response to the heated controversies accompanying the introduction of genetically modified organisms (GMOs) in western Europe during the mid-1990s (e.g. Gaskell and Bauer 2001).

Initially, these attempts to involve the general public had the character of a unilateral communication campaign to inform the public (‘public understanding of science’), so as to take away their supposedly irrational fears. This approach, nowadays referred to as the ‘knowledge deficit model’ (House of Lords 2000), has by and large been unsuccessful, as many members of the public protested against having their opinions dismissed as ‘irrational’ and ‘emotional’ (Wynne 2001). Recent years therefore have seen attempts at a more symmetrical approach, where the need for an open dialogue between experts and the general public is stressed. This approach is usually referred to as ‘public engagement’ or ‘public participation’ (Bucchi 2008; Dillon 2011).

However, as yet, the public participation approach has not been particularly successful. In practice, it is proving hard to take leave of the ‘knowledge deficit model’: experts, and sometimes the audience as well, still see it as the experts’ role to ‘provide the facts’ and ‘stay clear of values’, rather than to engage in an open dialogue about the values that are already part of scientific and technological research and to open themselves up to the facts that the public might have to offer (Wynne 2006; Radstake et al. 2009; Verhoeff and Waarlo 2011). Perhaps more fundamentally, many experts and policy makers still maintain a determinist mindset about the relationship between science and technology and society. They remain convinced that little influence can be exerted on the course of scientific and technological development: ‘Progress cannot be stopped’ (Shelley-Egan 2011; Wynne 2006).

2.2 Discourse on Technology is Limited to Risk and Neglects Soft Impacts

Technologists, policy makers and citizens tend to systematically underestimate the agency of technologies, as they still cling to the idea that technology is basically a passive instrument. As a result of this instrumentalist perception of technological artifacts, the agenda of these dialogues tends to remain fairly restricted. The dominant perception of technology as essentially an instrument pre-structures the debate in two ways. Firstly, more attention is given to negative than to positive impacts. How users apply an instrument is not a matter of public concern: they are left to their own devices, on the condition that they do not harm anyone else in doing so. Only when technology use harms other stakeholders does the technology become the subject of public debate and intervention. This explains why we have ‘risk assessment’ but not something like ‘happiness assessment.’ In other words, the dominant approach towards technology is ‘what to avoid’ rather than ‘what to achieve.’ Positive impacts (benefits) are privatized; only negative impacts (costs) gain access to the public agenda. As a result of this bias, it is very hard to initiate public deliberation on questions such as ‘Do we want this technology, even if it is not risky?’ or ‘What technological alternative would we prefer?’

Thus, public discussions will quickly gravitate towards the issue of ‘risk.’ ‘Risk’ could in principle be broadly defined as the chance that something undesirable will happen, that is, the chance that an important value may be harmed. However, in practice, the meaning of the concept is much reduced. Of all the values that can be threatened by a technology, policy makers and technology actors tend to focus exclusively on three: the chance that a technology will adversely affect our safety, our health and/or the environment (Swierstra and Te Molder 2012). Of course, citizens are also interested in this set of values. However, they also care about other values, as is evidenced by the GMO controversy. Citizens would, in the public debate, often refer to other values, such as naturalness, a non-instrumental relationship with plants and animals, respect for Creation, distributive justice and so forth.4

The difference between risks, or ‘(negative) hard impacts,’ and ‘soft impacts’ (Swierstra and Te Molder 2012) is a gradual one, and can be mapped along three dimensions. Firstly, risks typically involve non-controversial values, such as physical harm. In the case of soft impacts, there is typically less consensus on whether harm really occurs, for example, does Facebook make a travesty of friendship (Turkle 2010) or does it simply bring forth a new form of friendship? The mobile telephone has changed social practices and expectations with regard to being ‘reachable’ all the time, but is this progress or not? Secondly, risks are typically quantifiable: there is x chance that y people get hurt or the environment gets polluted with z. In the case of soft impacts, as the Facebook example illustrates, impacts tend to be qualitative rather than quantitative. Lastly, in the case of risk, it is possible to connect the negative impact through a direct causal link to a specific technology, and thus to hold the developers or producers accountable. In the case of soft impacts, the impact is typically co-produced by technology and the user, and thus developer and user quarrel about who is responsible for what (Van de Poel 2007).

2.3 Technology Can Evoke Moral Change

A very special type of soft impact is that science and technology not only help to realize existing moral values, such as health, sustainability or safety, or conflict with them, but also evoke moral change (Swierstra et al. 2009; Boenink et al. 2010). A clear relationship between technological and moral change can be found in the domain of sexual morality. The pill, for instance, by severing the link between intercourse and reproduction, not only enhanced women’s sexual autonomy but also contributed to more hedonistic conceptions of sex, and subtly changed the connotation from ‘receiving’ children as a gift to ‘having’ children. The pill also had unforeseen consequences for the position of homosexuals. As this new technological device enabled heterosexuals to have intercourse without reproduction, the traditional moral objection against (male) homosexuals that they wasted their semen could no longer be upheld.

With the increasing success of new reproductive technologies such as in vitro fertilization (IVF), the established human right to create a family has subtly changed from a negative freedom right into a positive claim right. David Hume famously pointed out that ‘ought implies can’ (Hume 1987, p. 521), but in a technological culture, this dictum can also be read backwards: new technological ‘cans’ often imply new moral ‘oughts’—now we are able to do it, we are under a moral obligation to do it.

The influence of technology is, of course, not restricted to sexual morality. New technologies disclose the world in novel ways, often with moral changes as a consequence. Modern media, from newspapers and television to Twitter and Facebook, have made us aware of the predicament of people in places on the other side of the world, thus enabling them to appeal to us for help. Thanks to modern technologies such as computers, we have become aware of human-driven climate change, thus creating a whole new category of previously unheard-of responsibilities and obligations with regard to sustainability.

On a more abstract level, our growing power over the natural world has led to a subtle change in the way people justify their actions. When one has little power over the world, one cannot vouch for the consequences of one’s actions. What one can vouch for are one’s intentions; the outcomes of one’s well-intended actions are left to God or fate. This world view corresponds with a more deontological approach in ethics, focused on doing one’s duty regardless of the consequences. But as technologies are steadily increasing our power over the consequences of our actions, consequentialist justifications of moral choices grow in prominence. It is no longer enough to do one’s duty; one is now also increasingly held accountable for the consequences of one’s choices. And finally, starting with Galileo’s telescope, a whole host of scientific instruments have helped reshape our vision of what humans are, what their place in the world and order of things is, what the earth is, what makes humans flourish and what constitutes a good life. In summary, our ideas about good and bad not only inform and guide scientific and technological research and development but are also shaped by them.

Modern societies are gradually learning to deal with (negative) hard impacts by developing procedures and institutions: risk assessment, but also legal frameworks that distribute accountability in case such impacts materialize. Unfortunately, they are as yet far less successful when it comes to dealing in constructive ways with soft impacts. Both technology actors and policy makers tend to dismiss these soft impacts; they are somehow too intangible to merit serious attention. Soft impacts typically get removed from the public debate and relegated to the private sphere, as they are considered highly personal, subjective, lifestyle issues.

This unwillingness to deal with technology’s soft impacts is often a cause of major discontent, because, as in the case of the knowledge deficit model, large parts of the population feel that their concerns are not taken seriously. To quote Engel, partners in communication do not only have ‘a need to know and understand, but also a need to feel known and understood’ (Engel 1988, p. 114). There is no linear relationship between soft impacts and the evolution of public controversy, but dismissing latent concerns about soft impacts easily engenders outbursts of public discontent later on. By then, repeated experiences and accumulated irritations have replaced the early, largely invisible and not necessarily negative concerns (see, for example, Marris 2001). Technologists and policy makers should learn to recognize that, for instance, religious critiques (‘playing God’) might also pose questions about the limits of science (Wynne 2001). The paradigm case here is again the Monsanto debacle of the mid-1990s. Much of the public concern seemed to regard the hard impacts of modified crops (environmental risks and health concerns), but these concerns often sprang from other concerns about soft impacts, for example that genetic modification exemplified technological hubris, or that it increased the power of big corporations over small farmers (see also Joly and Marris 2001).

The intense public scrutiny and vehement social resistance led to a paradigm shift in STS, including Science Communication. Co-evolution or co-production of science and technology and society became the new view of technology. The mutual or reciprocal influence was recognized and shaped into stakeholder dialogues, for example on future scenarios. These dialogues include sharing knowledge and concerns, and enable participative decision-making. Table 1 summarizes the different views of technology.
Table 1 Views of technology (in the making), characterized by perceptions of societal interaction and agency and by framing of societal discourse

| View of technology | Perception of societal interaction | Perception of agency | Framing of societal discourse |
|---|---|---|---|
| Determinist | Science and technology develop autonomously | Individuals and collectives are powerless in steering technology | Hold on technology development left out of debate |
| Instrumentalist | Societal values inert to technology (in the making) | Individuals and collectives fully in control of use of artifacts | Only risks of technology count; soft impacts left out of debate |
| Co-evolutionary | Technology and society shape each other | Individuals and organizations can act as stakeholders; participative decision-making | Sharing knowledge and concerns; scenarios include soft impacts |

The inclusion of future scenarios in the discourse is sometimes dismissed with the argument that the future cannot be predicted, as it is essentially open and dependent on human choices and contingent circumstances. As a result, any discussion of future impacts of existing technologies, not to mention emerging technologies, can only be speculative. Some people will for this reason avoid or devalue the discussion on impacts, asserting that they simply prefer to ‘stick to the facts’. But this argument is incoherent for at least three reasons.

Firstly, this skepticism is usually reserved for soft impacts only, while risk assessment is exempted from it. The implicit suggestion is that talking about risks is somehow less speculative than talking about other impacts of technology. Secondly, it is usually forgotten that there is no technology development without the developers stating possible aims and goals for the new technology. New technologies are invariably born in a cradle of promises and expectations. More often than not, with hindsight, we have to admit that these were inflated, sometimes intentionally so, to mobilize financial and political support (Van Lente 1993). Thus, if the proponents of a technology are allowed to speculate on the future, why not allow the same leeway to opponents and skeptics? And finally, and most fundamentally, avoiding speculation about the future is simply not an option. Any goal-oriented behavior requires some speculation, as goals by definition lie in the future. Thus, society’s aim should not be to avoid speculation with regard to technology’s impacts but to make speculations more realistic.

What do these considerations add up to when we want to reflect on socio-scientific classroom discourses on genetic testing? From experience, we know that classroom discourses on new and emerging science and technology resemble the larger public debate (e.g. Waarlo 1998, 1999). For example, the implicit instrumentalist and determinist attitudes towards technology are reflected in ‘discussion stoppers’ such as ‘it is everybody’s own choice’ and ‘you cannot stop progress’.

The goal of conducting such classroom discourses is to prepare students for their citizenship roles in the context of a technological culture. Thus, the questions that we want to explore in the following sections are: How can we make students aware of the mutual shaping of science and technology on the one hand and society on the other? How can we clarify and reflect on their spontaneous attitudes and underlying values? How can we give students confidence that they can in fact co-shape the course of scientific and technological development? And how can we create room for the deliberative exploration of how technologies shape their lives, that is, not only look at failing or destructive technologies (‘hard impacts’) but also at technology’s creative side (‘soft impacts’), including the phenomenon of techno-moral change? The next section will analyze socio-scientific issues related to genetic testing from the perspective of the co-evolution of science and technology and society. After this, the relevance of this perspective for education on socio-scientific issues will be discussed, with a special elaboration of education on genetic testing.

3 STS View on Genetic Testing

When discussing the impacts of genetic testing, we should distinguish between the testing activity itself and the decisions made on the basis of test results. The testing activity itself has few impacts, as its role is to determine the genetic makeup of an individual, not to change it. Legislation on genetic testing is therefore different from legislation on medical treatments, in which strict regulations are formulated to prevent negative side effects. This might lead people to conclude that genetic testing has no serious impacts, because it merely provides information. Whereas societal discussions on GMOs were vehement and still continue, the development of new testing practices causes less societal concern. However, the decisions based on this information have many impacts, and these will be discussed below.

3.1 Uses of Genetic Information

Genetic testing originated in clinical genetic practice. Genetic tests provide information that can be used for different purposes. By testing patients, the genetic information can be used to diagnose a disease and thus start medical treatment if possible. In genetic counseling, this information is also used to predict the chance that a disease present in the family will recur in that family or to detect possibly affected family members. After having been informed about a condition running in the family, these relatives can decide whether or not to have a test and act upon the test results, which may affect reproductive choice and lifestyle adaptations or may even result in preventative surgery. In these cases, the options are presented by medical professionals, but the decision-making is up to the patients and their relatives. Generally, the informed relatives are not patients (yet) but become aware of their genetic risk factors. Genetic testing can be done at different stages in human development, from pre-conception (informing future parents on the risk of having an affected baby), to pre-implantation (selecting embryos for implantation), pre-natal (detecting genetic anomalies during pregnancy) and neonatal (detecting diseases that can be prevented or alleviated by early treatment).

Genetic testing of healthy people is standard practice in genetic screening. Genetic screening is mostly done with large groups of the population, with the aim of early detection of genetic risk factors and disease prevention by early treatment (Health Council of the Netherlands 2008). The heel prick is an example in which the government decided to test all newborn babies for genetic diseases, the effects of which can be prevented by early treatment. A recent development is that screening decisions can also be made by individuals, for example future parents who want to be informed about their carrier status for various diseases (pre-marital and pre-conception screening). In most cases, these genetic tests concern monogenetic diseases with a high prediction reliability, but this is changing because of new-found relationships between the genome and health risks, which produce less reliable predictions, or risk indications that depend on environmental conditions, such as extreme physical demands.

3.2 Hard and Soft Impacts of Genetic Testing

Diagnostic testing and genetic screening have important effects in terms of preventing disease and death, for example through the selection, as part of IVF treatment, of embryos without life-threatening gene variants. The preventative effects are quantifiable, for example by calculating the number of cases of disease prevented by the heel prick. This means that they count as typical hard impacts. Also quantifiable is the number of persons tested versus the number of detected patients. In 2007, 185,000 Dutch babies were tested by heel prick, resulting in the prevention of 230 severe cases of disease (Lanting et al. 2008). Because of false-positive results, hundreds more babies tested positive in the first test without having the gene variant. This means that another impact of genetic testing is the burden of extra testing and unnecessary anxiety in parents. This anxiety caused by false-positive results can be considered a soft impact of genetic testing.

Other soft impacts of testing for genetic risk factors can have dramatic personal consequences, such as athletes being excluded from participation based on a proven elevated risk (McNamee et al. 2009). Similar consequences in other selection procedures can be imagined in the future. In a society that strives to eliminate every risk, the development of risk-predicting genetic tests is an important instrument to prevent risk by the selection of people with good gene variants (Vos 2008). In these cases, the positive hard impact of preventing death or disease is put into perspective by the negative soft impact of limited access to a particular career for individuals with an elevated risk. When the frequency of risk-increasing alleles is unevenly distributed among different parts of the population depending on their origin, ethnicity also comes into the genetic discussions. An example is the testing for the sickle cell allele in American college athletes (Science Daily 2010). Sickle cell anemia is much more common in African–American athletes (Kark et al. 1987) who for this reason run a higher risk of being excluded from participation.

The inclusion of genetic risk factors with lower predictive reliability in genetic testing practices can also lead to a third kind of soft impact: changes in the concept of health and in the perception of accountability. As a high proportion of the population are carriers of at-risk genotypes, more predictive testing can result in many healthy people becoming worried about their health (the 'worried well'), leading to medical overconsumption (Clarke et al. 2009; Greely 2011). This can have negative hard impacts such as unnecessary treatments or follow-up diagnostics. The availability of more genetic test options also brings about changes in morality. When the birth of a child with a hereditary disease can be prevented, parents who choose not to test will feel social pressure to account for this choice; they can be judged by society, and possibly even by their own child. The solidarity on which health insurance is based can also change when an expensive treatment can be prevented by genetic testing (Godard et al. 2003).

A final soft impact to be discussed here is the negative implication of knowing your genetic risk status. In the practice of 'cascade screening', the relatives of a patient diagnosed with a genetic disease are informed about their possible risk status, which can be determined by a genetic test. The test offer is accompanied by the advice that, if one is in the process of negotiating life insurance, it is wise to postpone testing until after the deal has been closed. Insurance questionnaires about health can thus be answered truthfully in terms of not knowing one's genetic risk status. Knowing your genetic predisposition could have not only financial consequences but also psychosocial consequences in terms of loss of well-being or of enjoyment of life (living with a time bomb, an affected identity) (Messner 2011). Ever cheaper and quicker sequencing methods make it plausible that complete genome sequencing will become part of standard medical practice (Health Council of the Netherlands 2010). Instead of asking what genetic information we want to know, the acute question becomes what genetic information we do not want to know (Elias and Annas 1994; Health Council of the Netherlands 2010).

We can conclude that, ironically, together with its positive hard effect in terms of preventing disease and death, genetic testing generates various soft impacts. The latter concern the stress caused by false-positive test results, limited access to certain careers based on genetic risk, increased demand for genetic testing and subsequent prevention measures, changes in the conception of health and the perception of responsibility to prevent risks, and the negative effects of knowing your genetic status.

3.3 Societal Influence on Genetic Testing

Genetic testing technology clearly impacts on society, but how does society influence the technology of genetic testing? The development of this technology passes through different phases: basic biomedical research on the correlations between the genome and a disease, the translation of research findings into clinical practice (diagnostic test), and implementing the test in medical practice or making it available for private use (Fig. 1).
Fig. 1

Societal influences on the development and availability of genetic tests

The first and second phases are carried out within the medical regime (Nelis 1998). New gene variants related to disease are found in biomedical research. This research does not aim to develop genetic tests but to understand the relationship between the genome and a specific disease or behavior. The decision to include these variants in genetic tests is a second step. The tests can be used for the diagnosis of a patient or a person at risk, or for screening large numbers of the healthy population, as in the heel prick. Genetic information revealed by diagnostic testing of a patient or person at risk can be used for further diagnosis and, where possible, the choice of medical treatment.

In some cases, patient organizations urge the development of better tests, or the availability of diagnosis and subsequent medication, as with the microarray for breast cancer types and herceptin medication in the Netherlands (Verhoeff et al. 2008). The Dutch foundation representing patients with familial hypercholesterolemia also negotiates with insurance companies on the conditions for admission to life insurance, and cooperates with healthcare organizations in tracing potential new patients.

The use and availability of genetic tests in screening healthy individuals (pre-symptomatic testing) is much more of a public affair. In the Netherlands, strict criteria are used to determine which tests to include in government screening programs. These are based on the criteria formulated by Wilson and Jungner (1968), which prescribe that the test should detect a serious health problem, that the person tested should benefit from the test, and that reliable diagnostic methods and options for preventing serious effects are available. However, government restrictions on the genes to be tested are not always accepted by stakeholders. Future parents express their wish to be informed of the risk of diseases, whether treatable or not (Plass et al. 2010), and government restrictions can be viewed as a form of paternalism. The commercial offering of tests, for which no medical or government consent is required, provides possibilities for individual choice. On the other hand, discussion arises as to whether the government should regulate this practice (Beaudet 2010).

In the Netherlands, the government is even more careful in taking initiatives in pre-conception screening, in which future parents can be informed about their carrier status for severe diseases. A political discussion arose concerning the practice of pre-implantation genetic diagnosis (PGD) (Lammens et al. 2009). Parents with a history of severe genetic diseases can have embryos created through IVF tested before implantation for gene variants underlying diseases such as Huntington disease and cystic fibrosis; these gene variants predict the disease with a very high degree of reliability. The intention to include BRCA-1 and BRCA-2 (alleles involved in breast cancer predisposition) in PGD testing met with political protest, based on the argument that the prediction of breast cancer from these alleles is high but not 100 %. This could open the way for testing other gene variants with still lower predictive value.

However, pre-conception screening is not subject to government decisions. In the USA, commercial firms such as Counsyl offer pre-conception screening for a host of diseases, and in the Netherlands, one university hospital has offered pre-conception screening for cystic fibrosis since May 2011. Insurance firms can become stakeholders in this facility; at least one insurance firm offers to pay the cost of this pre-conception screening. Insurance companies also play another role by requiring genetic testing when genetic conditions run in the family. Political discourse on these issues has led to legislation in different countries on this use of genetic information (Rothstein and Joly 2009). Requests for testing can also be made by employers or, in the case of elite sports, by sports organizations or colleges. This again can become the subject of government regulation, such as laws prohibiting selection based on genetic differences.

We can conclude that some of the soft impacts of genetic testing already play a role in the political arena, and that they are sometimes not so soft. Medical institutions, governments, patient organizations, insurance companies, employers and commercial firms offering genetic tests do not so much influence the development of genetic tests as their use and availability.

4 Revisiting Classroom Discourses on Genetic Testing

Instrumentalist and determinist attitudes towards technological innovations will be recognizable in classroom discourses by ‘discussion stoppers’ such as ‘It is everybody’s own choice whether or not to use this technology’ and ‘Technological advancement cannot be stopped/influenced.’ In order to improve classroom discourses, an STS-informed educational design should clarify how society and technological innovations are intertwined and influence each other. An instrumentalist attitude revealed by the discussion stopper ‘It is everybody’s own choice whether or not to use this technology’ can be countered by bringing into discussion (often not foreseen) soft impacts of the technology, especially examples of moral changes induced by the technology. Education on technological innovation should therefore distinguish between hard and soft impacts, and make clear that these impacts are related to values. Hard impacts are mostly related to uncontested values such as safety or health, whereas soft impacts are related to less tangible values such as respect for autonomy, naturalness and equal treatment.

A determinist attitude revealed by the discussion stopper 'Technological advancement cannot be stopped/influenced' can be challenged by discussing ways in which stakeholders have agency and can influence if and how the technology in question is used. Opening students' minds and informing them is the first step in equipping them for the future. Future scenarios used in STS research for the anticipatory governance of techno-moral change have imaginative power and can also be used in classrooms to stimulate thinking ahead and critical reflection.

In short, from an STS perspective, socio-scientific learning can be enhanced by
  • raising awareness of ‘discussion stoppers’ and countering them;

  • uncovering instrumentalist and determinist attitudes to technology in students;

  • demonstrating co-evolution of science and technology and society (‘science and technology in society’);

  • distinguishing between hard and soft impacts of technology and clarifying underlying values; and

  • using scenarios to imagine techno-moral change and thus make it more tangible and accessible to critical reflection.

Our analysis of genetic testing from an STS perspective shows that genetic testing has both important hard and soft impacts. The hard impacts are mostly positive, whereas the soft impacts tend to be negative, such as the stress caused by false-positive results, limited access to careers in the case of genetic risk, increased demands for genetic testing and subsequent prevention measures, changes in what is considered healthy and responsible behavior, and the psychosocial implications of knowing your genetic risk status. In addition, variation in genetic disease frequency among ethnic groups may bring ethnicity issues into the debate. Values involved in hard impacts are health and safety in terms of preventing disease or death by modifying one’s lifestyle to counter genetic risk. Values involved in soft impacts are respect for autonomy, naturalness (in contrast to medicalization) and equal treatment in cases of being at risk or membership of an ethnic group.

Many societal stakeholders are involved in genetic testing, such as biomedical researchers, clinical geneticists, midwives, patient organizations and pharmaceutical companies. Stakeholder agency focuses on the development and/or availability of a genetic test. Furthermore, agency is related to questions such as who decides whether a test is obligatory and who makes the decisions based on the test results. Introducing stakeholders into the debate on genetic testing is important, as most students are not aware of the different stakeholders and their various interests.

4.1 Narratives on Hard and Soft Impacts in the Classroom

The use of narratives exemplifying hard and soft impacts is a powerful teaching method. By presenting both types of impact, students are stimulated to consider different perspectives on genetic testing and become aware of the complexity of the issue (Boerwinkel and Waarlo 2011). These conflicting perspectives are an example of 'epistemological disturbances,' which are an effective way of introducing students to other perspectives (Simonneaux 2011). Narratives are helpful in achieving empathetic involvement; in addition, narrative modes of thought can enrich logico-scientific modes in argumentation on socio-scientific issues (Levinson 2006). Narratives can be real-life, for example direct contact with patients (Tal et al. 2011) or videotaped incidents, or fictional, such as realistic fiction stories or even science fiction (Knippels et al. 2009).

Cases should be selected carefully (Boerwinkel et al. 2011). Apart from making both hard and soft impacts visible, a case should demonstrate that stakeholders (can) differ in interests and perspectives, and that decisions often involve conflicting values such as respect for autonomy versus prevention of harm. Cases for narratives can be sought specifically in practices that have started recently and foreshadow probable future developments. In this way, the case is both realistic and a starting point for a future scenario. Good examples of such cases are testing in elite sport and pre-conception screening.

In elite sport, physical demands are extreme, and extreme financial investments are made in individual athletes. This means that, in elite sport, technological developments become visible at an earlier stage than in society in general (McNamee et al. 2009). Pre-conception screening for couples without a medical history is another example: in some countries this is already standard practice, while in others it is a new development. In general, it can be expected that the possibilities for screening will increase, leading to new options and responsibilities. We have developed educational material to make students aware of the impact of genetic testing in elite sport.

Hard and soft impacts in elite sport

After a short introduction on physical testing in elite sport, genetic testing is presented as a new possibility. Students are invited to indicate on a scale of 1–5 whether they think genetic testing in elite sport is a good idea (1 = a very bad idea, 5 = a very good idea). They are then confronted with a real-life narrative containing a YouTube clip in which a football player collapses on the field due to a condition called hypertrophic cardiomyopathy. This condition is unfortunately not very rare, and several genes contribute to it (Spinney 2004). After this clip, most students revise their opinions on genetic testing in elite sport and score much higher on the scale. Next, they watch another clip, this time of the 100 m breaststroke swimming final at the Olympic Games in Sydney. Contrary to their expectations, nobody collapses in this race. They are informed that the winner of this final was afterwards diagnosed with the same condition as the football player in the first clip and was therefore excluded from the Olympics in 2004. Students reconsider their opinion, and this time have serious doubts about genetic testing, especially when the decision to test or to stop is left not to the athlete but to a sports organization. Afterwards, they discuss whether a regime of obligatory genetic testing for athletes should be installed. In the end, they have arguments from different perspectives, which they use to substantiate their position on genetic testing in elite sport (Boerwinkel et al. 2011).

Trialling this educational strategy in the classroom revealed that it was effective in raising awareness of the complexity of the dilemma. Students mentioned the positive hard impact of testing in terms of prevention of harm. They also realized the negative soft impact in terms of violating respect for autonomy. The different interests and responsibilities of stakeholders such as the sports clubs were also brought up. Sports clubs pay large sums of money for players. In addition, they can be held responsible when they knowingly line up a player who is at risk. Insurance firms might require a test for this condition when the applicant enters a profession with high physical demands. Finally, the athletes are stakeholders, because they might be obliged to stop their career. All in all, students began to realize that an available genetic test is not merely a neutral instrument to be used at will. Some changed their opinion on what is an acceptable risk and what is morally right to do, and thus experienced techno-moral change.

The case of testing college athletes in the USA for the sickle cell trait, described earlier, shows even more clearly that the possibility of genetic testing not only changes perceptions of what is acceptable but also changes the autonomy and roles of the different stakeholders. Before the lawsuit, athletes could decide on participating, colleges could decide on their policy and coaches could decide whom to choose and how to train. After the lawsuit, colleges and athletes are obliged to test, and coaches are informed of the genetic condition of their players, which they have to take into account (Jordan et al. 2011). After discussing these kinds of cases, the instrumentalist discussion stopper 'It is everybody's own choice and responsibility to test' no longer holds up; often other people are involved, or the decision is taken out of your hands.

In the lesson on genetic testing in elite sport, students are also presented with a 'genetic gift certificate.' This certificate contains a list of conditions for which they can be tested, for instance 'a severe disease that can be prevented' or 'genetic conditions that indicate the best-fitting type of exercise.' Students are invited to tick the conditions they want to be informed of through a free genetic test. After they have done this, the advantages and disadvantages of genetic knowledge are discussed. Generally, three groups of students emerged: some wanted to be fully informed, others did not want to know any genetic information, and some selected specific genetic information. Students who declined any genetic information mostly indicated soft impacts such as 'It makes your life less your own' and 'You will worry all the time.' Students who wanted to know all the information stated that information is always advantageous, as you can use it to adapt your lifestyle.

Interesting discussions followed when the question arose as to whether or not they would want to know the genetic conditions of their children. On being asked, more students wanted to know this, as they see it as their responsibility as a parent. Feeling responsible for your children's health can incite people to use newly available genetic tests. It should be noted that the instrumentalist attitude expressed in saying 'It is everybody's own choice and responsibility to test' is contradicted here. This is an example of 'can implies ought,' as mentioned before in the context of techno-moral change. It may still be your own choice to opt for a genetic test, but this choice is influenced by social pressure. It is also influenced by commercial advertisements for genetic testing such as 'Protect your baby from 100+ genetic diseases with a simple test, even before pregnancy.' Who would refuse?

4.2 Prediction Reliability and False Test Results

How reliable are genetic tests and what do test results mean? Genetic information tends to be taken very seriously, both because genes are often considered the ultimate cause that cannot be influenced ('genetic determinism'), and because DNA analysis seems to provide exact results, especially in TV series on crime investigation. Disease prevention by genetic testing may be quantifiable at the societal level, but at the individual level, genetic predictions have inherent uncertainty. The presence of gene variants associated with disease indicates a risk but seldom predicts exactly what will happen and when. This is demonstrated by the case of hypertrophic cardiomyopathy, in which the presence of gene variants can in one case cause early death but in another allow the carrier to win an Olympic race. This uncertainty arises because several genes are often involved, and because the same gene variant can have different effects in different individuals. Another uncertainty concerns false-positive and false-negative test results. Many genetic tests, such as the heel prick, are not directed at DNA but measure metabolites associated with the disease. As deviant levels of metabolites can also have causes other than the gene variant, false-positive results are common, and a second test is then needed to ascertain the genetic cause. Explaining the causes and nature of false-positive results is important, as students often assume that a test directly measures the presence or absence of a disease. As mentioned before, false-positive results can cause unnecessary anxiety and subsequent further testing, which can be costly and may entail risks. In this way, both types of uncertainty of genetic information have soft impacts.

False-positive and false-negative test results

Important concepts in evaluating the quality of a genetic test are specificity and sensitivity. The specificity indicates to what extent people without the disease receive a negative test result; a test with low specificity will produce many false-positive results. The sensitivity indicates to what extent disease cases are detected by the test; a test with low sensitivity will produce many false-negative results. Ideally, a test has both 100 % specificity and 100 % sensitivity, but in reality this is never the case. Students, however, often think that testing is identical to identifying a disease. A way to discuss this with students is to use realistic test data, for example on congenital hypothyroidism (Table 2). In this case, the test was administered to 179,800 children, of whom 300 tested positive. Further testing of these children confirmed 60 disease cases; two cases were not detected by the test. Calculating with these figures and discussing the outcomes helps students to think about whether a new test should be designed with a better specificity (240 out of 300 positives proved to be false), often with the side effect that the sensitivity will be lower and more cases will remain undetected. They come to realize that there is often a trade-off between advantages and disadvantages, and that the decision can differ from case to case, depending among other factors on the seriousness of the disease and the negative effects of false-positive results.
Table 2

False-positive and false-negative results in testing for congenital hypothyroidism

Test result    Disease present    Disease absent    Total
Positive       60                 240               300
Negative       2                  179,498           179,500
Total          62                 179,738           179,800

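The figures above can be turned into a short calculation that students (or teachers preparing the lesson) can check directly. The following sketch, in Python, derives sensitivity, specificity and the positive predictive value from the congenital hypothyroidism data:

```python
# Congenital hypothyroidism screening figures (Table 2):
# 179,800 children tested, 300 positive results, 60 confirmed cases,
# 2 cases missed by the test.
tested = 179_800
test_positive = 300
true_positive = 60
false_negative = 2

false_positive = test_positive - true_positive            # 240
true_negative = tested - test_positive - false_negative   # 179,498

# Sensitivity: the fraction of actual disease cases detected by the test.
sensitivity = true_positive / (true_positive + false_negative)

# Specificity: the fraction of disease-free children with a negative result.
specificity = true_negative / (true_negative + false_positive)

# Positive predictive value: the chance that a positive result is a real case.
ppv = true_positive / test_positive

print(f"sensitivity = {sensitivity:.3f}")   # 0.968
print(f"specificity = {specificity:.4f}")   # 0.9987
print(f"ppv         = {ppv:.2f}")           # 0.20
```

The calculation makes the trade-off tangible: even with a specificity above 99.8 %, only one in five positive results indicates a real case, because the disease is rare in the screened population.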
4.3 Preparing for Agency

As mentioned above, students express both determinist and instrumentalist attitudes towards technology, which do not conform to the actual intertwinement of technology and society. Stakeholders can influence the development and implementation of genetic tests. Students' attitudes should therefore be countered with examples of stakeholder participation, such as the concerted action of people with shared interests in putting forward their demands. Good examples of agency are the debates and lawsuits on the patenting of the BRCA-1 gene. The firm Myriad Genetics claimed a patent on all medical actions based on the sequence of the gene mutations it discovered (Pollack 2012), and many groups of breast cancer patients have cooperated internationally to oppose this claim. In education on genetic testing, this example can be used to demonstrate that you can refuse to accept a given situation and that you have to empower yourself. Students can learn that a just and healthy situation in society is not realized automatically but requires alertness and active lobbying for rights; thus, they should also acquire insight into the process of societal decision-making. The case of sickle cell anemia testing, described above, provides another illustrative narrative.

4.4 Imagining the Future Using Techno-Moral Vignettes

Human beings constantly imagine the future and then act on it. So why not think ahead purposefully, critically and creatively about what technology might bring? In STS studies, future scenarios are a way to support policy makers and the public in co-shaping the future. How can we involve and engage students? Thought-provoking short stories describing possible futures for genetic testing in our society and our lives seem promising in this respect. These stories are based on current developments in the field but should not be read as predictions of the future; they invite the reader to imagine and consider ways in which genetic testing might change our world, ideas, values and ideals. We call them 'techno-moral vignettes' because they explore both different fields of application for genetic testing and the moral sensitivities and concerns that might be raised by these future applications. The vignettes can be used as a source for debate, starting with a few simple but important questions. What exactly are the issues raised in the vignette, and what has changed in the world the vignette describes? What do you think of the issues described? Which person or argument in the story do you like most, or see as most controversial, and why? Is this indeed a future in which you would like to live? What should be done in the situation the vignette describes, and whom do you see as most responsible? Is there a role for politics to play? Part of a vignette on susceptibility testing is presented below (Stemerding et al. 2010).

Techno-moral vignette on susceptibility testing

… In 2017, two Dutch genome researchers published a striking proposal in the Dutch Journal of Medicine. Genetic susceptibility testing, based on microarrays including a large number of genetic markers, was becoming a real possibility and thus, the authors argued, might be used as a tool in targeting individuals at risk for developing type 2 diabetes. This would entail a more cost-effective approach than targeting all individuals sharing a particular family history. It would also resolve some of the more delicate questions about how to justify the inclusion of healthy relatives and children in such a program. And, even more important, in stimulating individuals to adopt a healthy lifestyle and to commit themselves to regular monitoring, an individualized genetic risk profile might be more effective than the more general risk information based on family history.

The proposal provoked an immediate response from some well-known clinical geneticists. They sympathized with the idea of personal genetic information as a valuable resource in individual decision-making but also voiced serious reservations. Whereas the personalized character of risk information was indeed a source of empowerment in matters of health and lifestyle, its genetic nature, on the other hand, might just as well cause fatalism. There was also the danger of misunderstanding a ‘good’ genetic profile as a license to continue an unhealthy lifestyle. Clinical geneticists knew all too well how difficult it is for individuals to deal with complex genetic information. The introduction of genetic susceptibility screening in a program of prevention thus implied, as the authors emphasized, a tremendous challenge in terms of counseling. Would the costs of such an effort really justify the benefits? These considerations obviously got the ear of the health insurer involved in the program….

Characteristics of a techno-moral vignette include the following:
  • they are not too far in the future, so a plausible technological development is foreseen;

  • the application of this technology leads to impacts that cause societal debate;

  • the impacts include changes in routines and in what is considered normal and healthy;

  • stakeholders react in a way comparable to reactions to similar situations in the past.

Meeting these characteristics results in a plausible future that can be foreseen in rather concrete detail. Stakeholders in the case of genetic testing are researchers, clinical geneticists, physicians, insurance firms, the Ministry of Health and, of course, individuals caring about their health.

The question of how to adapt these vignettes for classroom use is now being studied. Students do not have to make policy decisions for which the vignettes were designed, but they should have a good image of possible futures and their role in them. Future scenarios have been used in environmental education (Lloyd and Wallace 2008) as a way of making students aware of their possible role in future developments. Other educational designs have effectively used science fiction as a way of stimulating students to produce pros and cons about possible future developments (Knippels et al. 2009).

5 Revisiting STS Education

In earlier STS education approaches, technology had a main role in problem-solving strategies (Pedretti and Nazir 2011) and was given a role equal to but distinct from science (Yager 1992). Earlier STS education often framed technological influences on society in a risk context in which hard impacts were discussed, such as the risks of nuclear power and genetic recombination. Critical remarks on this approach were that these problems are quite removed from everyday personal experience and therefore less relevant to students (Shamos 1995), and that the perspective remained within a 'hard science' framework that leaves no room for ethical choices and values (Hughes 2000). More recent approaches tend to neglect technology as a practice both influenced by and influencing society. Although 'science in the making' is considered a crucial element of STS education (Bingle and Gaskell 1994), 'technology in the making' is seldom mentioned in educational contexts. A recent publication on socio-scientific issues in the classroom contains only one specific discussion of a particular technology, in this case reproductive technology (Dawson 2011).

A contemporary STS perspective on socio-scientific issues might stimulate an approach that gives explicit attention to the interplay between technology and society, with an emphasis on soft impacts. This could bring technology back into the discourse at the interface between science and society, while focusing on the effects on everyday life and on ethical issues. STS also provides a broader view of risk: it demonstrates that the attempt to exclude all risks brings about other risks, which can have very negative consequences. Instead of trying to exclude all risks, we might learn to deal with them better. In addition, moral reflection and reasoning might be enriched by demonstrating that moral judgments of situations and actions can be influenced by what is possible, and thus by technological developments.

Agency is another element seldom explicitly addressed in science education aimed at citizenship. Agency does play a role in education for sustainable development (Pedretti and Nazir 2011; Van Eijck and Roth 2007), but the question of how to act upon the decisions made often remains open. Refuting the instrumentalist and determinist attitudes that an STS perspective reveals could contribute to a better image of possible actions. It would be worthwhile to study the role of the co-production of technology and society further, and to discuss whether the model of socio-scientific elements of functional scientific literacy (Zeidler et al. 2005) and the concept of socio-scientific reasoning (Sadler et al. 2007) should be adapted to include this co-production.

Interventionist STS research has used future scenarios on several emerging technologies, partly to advise government policy and enhance public deliberation on these issues (Swierstra and Te Molder 2012). These scenarios could be adapted to and used in educational settings (cf. Stemerding et al. 2010); as they are situated in a future in which today's students will be adult citizens, they are directly relevant to them. As well as using available scenarios, students can be invited to imagine possible futures themselves. This can be a good way to assess prior knowledge, to discuss moral values and to introduce political choices (Lloyd and Wallace 2008).

Discussing soft impacts and related values with students demands specific skills of science teachers. Teachers not only need specific knowledge of the interaction between technology and society; clarifying and reflecting on values with students also demands specific strategies for which they must be prepared. This means that teaching materials and training activities must be developed.

6 Conclusions and Outlook

Technology development seems underexposed in current science education research on socio-scientific learning. In this interdisciplinary analytical essay, we have outlined what STS research has to say about technology at the interface of science and society. Technology and society are no longer seen as independent entities (technology and society), but rather as shaping each other (technology in society). Nevertheless, public debates on technological innovations still suffer from an imbalance: quantifiable risks are overemphasized at the expense of attention to the soft impacts of technology.

With regard to genetic testing, soft impacts include the anxiety evoked by false-positive test results, limited access to particular jobs due to genetic risk, the increased demand for genetic testing and subsequent medical care due to commercialization, changes in the conception of health and in the perception of responsibility for one's health, and the psychosocial effects of knowing one's genetic status. Bringing these soft impacts into the discussion may enable students to put the positive hard impact of disease prevention into perspective.

Raising awareness of the societal influences on the development, use and availability of genetic tests is a first step in enhancing student agency. Bringing the influence of technology on morality up for discussion will contribute to scrutinizing the prevailing idea that ethics merely constrains technology. Furthermore, addressing the uncertainty in genetic predictions in classroom discourses seems helpful in developing a realistic view of science, technology and life.

Interventionist STS research has provided tools such as scenario studies for emerging technologies that seem appropriate for classroom use in practicing dialogue and participative decision-making. To keep a dialogue going, it is important to challenge discussion stoppers such as 'It is everybody's own choice', 'Playing God' or 'We can't stop technological advancement' by uncovering their underlying assumptions. This places considerable demands on teachers' pedagogical skills. The ideas developed in this essay are now being tested empirically in our research group's educational design studies on genomics education.


Notes

  1. See, for instance, Abma and Broerse (2010), Dijkstra (2008), Hanssen (2009), Kent (2011).

  2. See, for example, Ellul, Heidegger, Mumford and others.

  3. Bijker and Law (1992), Bijker et al. (1987), Feenberg (1991), Latour (1992), Winner (1980).

  4. Marris (2001), Wynne (1996, 2001, 2006).


Acknowledgements

This work was supported by grants from the Centre for Society and the Life Sciences and the Cancer Genomics Centre, both genomics centres of excellence of the Netherlands Genomics Initiative (NGI) / Netherlands Organisation for Scientific Research (NWO). The authors would like to thank the reviewers for their valuable comments and suggestions to improve the manuscript.

Open Access

This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.


References

  1. Abma, T. A., & Broerse, E. W. (2010). Patient participation as dialogue: Setting research agendas. Health Expectations, 13, 160–173.
  2. Beaudet, A. L. (2010). Which way for genetic-test regulation? Leave test interpretation to specialists. Nature, 466, 816–817.
  3. Bijker, W., & Law, J. (Eds.). (1992). Shaping technology/building society. Studies in sociotechnical change. Cambridge, MA: MIT Press.
  4. Bijker, W. E., Pinch, T. J., & Hughes, T. (Eds.). (1987). The social construction of technological systems. New directions in the sociology and history of technology. Cambridge, MA: MIT Press.
  5. Bingle, W. H., & Gaskell, P. J. (1994). Scientific literacy for decision making and the social construction of scientific knowledge. Science Education, 78(2), 185–201.
  6. Boenink, M., Swierstra, T., & Stemerding, D. (2010). Anticipating the interaction between technology and morality: A scenario study of experimenting with humans in bionanotechnology. Studies in Ethics, Law, and Technology, 4(2). doi: 10.2202/1941-6008.1098.
  7. Boerwinkel, D. J., Knippels, M.-C., & Waarlo, A. J. (2011). Raising awareness of pre-symptomatic genetic testing. Journal of Biological Education, 45(4), 213–221.
  8. Boerwinkel, D. J., & Waarlo, A. J. (Eds.). (2011). Genomics education for decision-making. In Proceedings of the second invitational workshop on genomics education, Dec 2–3, 2010, Utrecht, The Netherlands. Utrecht: Utrecht University, Freudenthal Institute for Science and Mathematics Education (FIsme Scientific Library, No. 67).
  9. Bucchi, M. (2008). Of deficits, deviations and dialogues. Theories of public communication of science. In M. Bucchi & B. Trench (Eds.), Handbook of public communication of science and technology (pp. 57–76). London/New York: Routledge.
  10. Clarke, A. E., Shim, J., Shostak, S., & Nelson, A. (2009). Biomedicalising genetic health, diseases and identities. In P. Atkinson, P. Glasner, & M. Lock (Eds.), Handbook of genetics and society (pp. 21–40). London: Routledge.
  11. Collingridge, D. (1980). The social control of technology. London: Pinter Publishers.
  12. Dawson, V. M. (2011). A case study of the impact of introducing socio-scientific issues into a reproduction unit in a Catholic girls' school. In T. D. Sadler (Ed.), Socio-scientific issues in the classroom. Contemporary trends and issues in science education, Vol. 39. Dordrecht: Springer.
  13. Dijkstra, A. M. (2008). Of publics and science. How publics engage with biotechnology and genomics. Thesis, University of Twente, The Netherlands.
  14. Dillon, J. (2011). Science communication—a UK perspective. International Journal of Science Education, Part B, 1(1), 5–8.
  15. Elias, S., & Annas, G. J. (1994). Generic consent for genetic screening. New England Journal of Medicine, 330(22), 1611–1613.
  16. Ellul, J. (1964). The technological society. New York: Alfred A. Knopf.
  17. Engel, G. L. (1988). How much longer must medicine's science be bound by a seventeenth century world view. In K. White (Ed.), The task of medicine: Dialogue at Wickenburg (pp. 113–116). Menlo Park: The Henry Kaiser Foundation.
  18. Feenberg, A. (1991). Critical theory of technology. Oxford: Oxford University Press.
  19. Gaskell, G., & Bauer, M. W. (Eds.). (2001). Biotechnology 1996–2000: The years of controversy. London: Science Museum.
  20. Godard, B., Raeburn, S., Pembrey, M., Bobrow, M., Farndon, P., & Ayme, S. (2003). Genetic information and testing in insurance and employment: Technical, social and ethical issues. European Journal of Human Genetics, 11, S123–S142.
  21. Greely, H. T. (2011). Get ready for the flood of fetal gene screening. Nature, 469, 289–291.
  22. Hanssen, L. (2009). From transmission toward transaction. Design requirements of successful public participation in communication and governance of science and technology. Thesis, University of Twente, The Netherlands.
  23. Health Council of the Netherlands. (2008). Screening: Between hope and hype. The Hague: Health Council of the Netherlands, publication no. 2008/05E.
  24. Health Council of the Netherlands. (2010). The 'thousand-dollar genome': An ethical exploration. Monitoring report ethics and health, 2010/2. The Hague: Centre for Ethics and Health, Health Council of the Netherlands, publication no. 2010/15E.
  25. House of Lords Select Committee on Science and Technology. (2000). Science and technology—third report. London: HMSO.
  26. Hughes, G. (2000). Marginalization of socio-scientific material in science-technology-society science curricula: Some implications for gender inclusivity and curriculum reform. Journal of Research in Science Teaching, 37, 426–440.
  27. Hume, D. (1987/1740). In E. C. Mossner (Ed.), A treatise of human nature. New York: Penguin.
  28. Jasanoff, S. (Ed.). (2004). States of knowledge: The co-production of science and social order. New York: Routledge.
  29. Joly, P., & Marris, C. (2001). Agenda-setting and controversies: A comparative approach to the case of GMOs in France and the United States. For the Workshop on European and American Perspectives on Regulating GE Food, INSEAD, Fontainebleau, June.
  30. Jordan, L. B., Smith-Whitley, K., Treadwell, M. J., Telfair, J., Grant, A. M., & Ohene-Frempong, K. (2011). Screening U.S. college athletes for their sickle cell disease carrier status. American Journal of Preventive Medicine, 41(6), 406–412.
  31. Kark, J. A., Posey, D. M., Schumacher, H. R., & Ruehle, C. J. (1987). Sickle-cell trait as a risk factor for sudden death in physical training. New England Journal of Medicine, 317(13), 781–787.
  32. Kent, A. (2011). Relation with public interest organisations: patients and families. In D. J. Bennett & R. C. Jennings (Eds.), Successful science communication. Telling it like it is (pp. 196–203). Cambridge: Cambridge University Press.
  33. Keulartz, J., Schermer, M., Korthals, M., & Swierstra, T. (2004). Ethics in a technological culture. A programmatic proposal for a pragmatist approach. Science, Technology and Human Values, 29(1), 3–29.
  34. Knippels, M.-C., Severiens, S. E., & Klop, T. (2009). Education through fiction: Acquiring opinion-forming skills in the context of genomics. International Journal of Science Education, 31(15), 2057–2083.
  35. Lammens, C., Bleiker, E., Aaronson, N., Vriends, A., Ausems, M., Jansweijer, M., et al. (2009). Attitude towards pre-implantation genetic diagnosis for hereditary cancer. Familial Cancer, 8(4), 457–464.
  36. Lanting, C. I., Rijpstra, A., Breuning-Boers, J. M., & Verkerk, P. H. (2008). Evaluatie van de neonatale hielprikscreening bij kinderen geboren in 2007 [Evaluation of neonatal heel prick screening in children born in 2007]. TNO rapport KvL P&Z 2008.119. TNO Kwaliteit van Leven.
  37. Latour, B. (1992). Where are the missing masses? The sociology of the new mundane artifacts. Shaping technology, building society. Cambridge, MA: MIT Press.
  38. Levinson, R. (2006). Towards a theoretical framework for teaching controversial socio-scientific issues. International Journal of Science Education, 28(10), 1201–1224.
  39. Lloyd, D., & Wallace, J. (2008). Imaging the future of science education: the case for making futures studies explicit in student learning. Studies in Science Education, 40(1), 139–177.
  40. Marris, C. (2001). Public views on GMOs: Deconstructing the myths. EMBO Reports, 2(7), 545–548.
  41. McNamee, M. J., Müller, A., van Hilvoorde, I., & Holm, S. (2009). Genetic testing and sports medicine ethics. Sports Medicine, 39(5), 339–344.
  42. Meijers, A. (2009). Philosophy of technology and engineering sciences. Amsterdam: Elsevier.
  43. Messner, D. A. (2011). Informed choice in direct-to-consumer genetic testing for Alzheimer and other diseases: Lessons from two cases. New Genetics and Society, 30(1), 59–72.
  44. Mitcham, C. (1994). Thinking through technology: The path between engineering and philosophy. Chicago: The University of Chicago Press.
  45. Nelis, A. (1998). DNA-diagnostiek in Nederland. Een regime-analyse van de ontwikkeling van de klinische genetica en DNA-diagnostische tests, 1970–1997 [DNA diagnostics in the Netherlands. A regime analysis of the development of clinical genetics and DNA diagnostic tests, 1970–1997]. Enschede: Twente University Press.
  46. Nowotny, H., Scott, P., & Gibbons, M. (2001). Rethinking science. Knowledge and the public in an age of uncertainty. Cambridge: Polity.
  47. Oudshoorn, N., & Pinch, T. (Eds.). (2003). How users matter: The co-construction of users and technology. Boston: MIT Press.
  48. Pedretti, E., & Nazir, J. (2011). Currents in STSE education: Mapping a complex field, 40 years on. Science Education, 95(4), 601–626.
  49. Plass, A. M. C., Van El, C. G., Pieters, T., & Cornel, M. C. (2010). Neonatal screening for treatable and untreatable disorders: Prospective parents' opinion in the Netherlands. Pediatrics, 125(1), e99–e106.
  50. Pollack, A. (2012). Justices send back gene case. New York Times, March 26.
  51. Postman, N. (1992). The judgment of Thamus. In Technopoly: The surrender of culture to technology (pp. 3–20). New York: Vintage.
  52. Radstake, M., Nelis, A., Van den Heuvel-Vromans, E., & Dortmans, K. (2009). Mediating online DNA-dialogues. From public engagement to interventionist research. Science Technology & Innovation Studies, 5, 37–47.
  53. Rip, A., & Kemp, R. (1998). Technological change. In S. Rayner & E. L. Malone (Eds.), Human choice and climate change (pp. 327–399). Columbus, OH: Battelle Press.
  54. Rothstein, M. A., & Joly, Y. (2009). Genetic information and insurance underwriting. Contemporary issues and approaches in the global economy. In P. Atkinson, P. Glasner, & M. Lock (Eds.), Handbook of genetics and society (pp. 127–144). London: Routledge.
  55. Sadler, T. D., Barab, S. A., & Scott, B. (2007). What do students gain by engaging in socio-scientific inquiry? Research in Science Education, 37(4), 371–391.
  56. Science Daily. (2010). New sickle cell screening program for college athletes comes with serious pitfalls, experts say. Science Daily, Sept 8.
  57. Shamos, M. H. (1995). The myth of scientific literacy. New Brunswick, NJ: Rutgers University Press.
  58. Shelley-Egan, C. (2011). Ethics in practice: responding to an evolving problematic situation of nanotechnology in society. Dissertation, Twente University, Enschede.
  59. Simonneaux, L. (2011). The reasoned arguments of a group of future biotechnology technicians on a controversial socio-scientific issue: human gene therapy. Journal of Biological Education, 45(3), 150–157.
  60. Smith, M. R., & Marx, L. (Eds.). (1994). Does technology drive history? The dilemma of technological determinism. Cambridge, MA: MIT Press.
  61. Spinney, L. (2004). Heart-stopping action. Nature, 430, 606–607.
  62. Stemerding, D., Swierstra, T., & Boenink, M. (2010). Exploring interaction between technology and morality in the field of genetic susceptibility testing: A scenario study. Futures, 42(10), 1133–1145.
  63. Swierstra, T., & Rip, A. (2007). Nano-ethics as NEST-ethics: Patterns of moral argumentation about new and emerging science and technology. NanoEthics, 1(1), 3–20.
  64. Swierstra, T., Stemerding, D., & Boenink, M. (2009). Exploring techno-moral change. The case of the obesity pill. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies. The international library of ethics, law and technology (Vol. 3, pp. 119–138). Dordrecht: Springer.
  65. Swierstra, T., & Te Molder, H. (2012). Risk and soft impacts. In S. Roeser, R. Hillerbrand, M. Peterson, & P. Sandin (Eds.), Handbook of risk theory. Epistemology, decision theory, ethics and social implications of risk (pp. 1050–1066). Dordrecht: Springer.
  66. Tal, T., Kali, Y., Magid, S., & Madhok, J. J. (2011). Enhancing the authenticity of a web-based module for teaching simple inheritance. In T. D. Sadler (Ed.), Socio-scientific issues in the classroom. Contemporary trends and issues in science education, Vol. 39. Dordrecht: Springer.
  67. Ten Have, H. (1995). Het gen als ziel van de mens [The gene as the soul of the human being]. Wijsgerig Perspectief, 37(1), 29–30.
  68. Turkle, S. (2010). Alone together. Why we expect more from technology and less from each other. New York: Basic Books.
  69. Van de Poel, I. R. (2007). Ethics in engineering practice. In S. Hylgaard Christensen, M. Meganck, & B. Delahousse (Eds.), Philosophy in engineering. Aarhus: Academica.
  70. Van Eijck, M., & Roth, W. M. (2007). Improving science education for sustainable development. PLoS Biology, 5(12), 2763–2769.
  71. Van Lente, H. (1993). Promising technology: The dynamics of expectations in technological developments. Delft: Eburon.
  72. Van Mil, M. H. W., Boerwinkel, D. J., Buizer-Voskamp, J. E., Speksnijder, A., & Waarlo, A. J. (2010). Genomics education in practice: Evaluation of a mobile lab design. Biochemistry and Molecular Biology Education, 38(4), 224–229.
  73. Verbeek, P. (2005). What things do—philosophical reflections on technology, agency and design. Pennsylvania: Penn State University Press.
  74. Verhoeff, R., Moors, E. H. M., & Osseweijer, P. (2008). Interactive communication in pharmacogenomics innovations. Genomics, Society and Policy, 4(2), 53–69.
  75. Verhoeff, R. P., & Waarlo, A. J. (2011). Good intentions, stubborn practice: A critical appraisal of a public event on cancer genomics. International Journal of Science Education, Part B, 10. doi: 10.1080/21548455.2011.610573.
  76. Vos, R. (2008). Genetic risks and justice in the workplace: the end of the protection paradigm? In G. deVries & K. Horstman (Eds.), Genetics from laboratory to society. Societal learning as an alternative to regulation (pp. 155–170). Basingstoke: Palgrave MacMillan.
  77. Waarlo, A. J. (1998). Teaching and learning of informed decision-making on predictive genetic testing: a pilot study. In H. Bayrhuber & F. Brinkman (Eds.), What—why—how? Research in didaktik of biology (pp. 196–204). Kiel: Institut für Didaktik der Naturwissenschaften.
  78. Waarlo, A. J. (1999). Biology students' forming and justifying of opinions on predictive genetic testing. Towards a practicable and effective teaching strategy. In M. Bandiera, S. Caravita, E. Torracca, & M. Vicentini (Eds.), Research in science education in Europe (pp. 41–48). Dordrecht/Boston/London: Kluwer Academic Publishers.
  79. Waelbers, K. (2011). Doing good with technologies. Dordrecht: Springer.
  80. Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.
  81. Wilson, J. M. G., & Jungner, G. (1968). Principles and practice of screening for disease. Geneva: World Health Organization, Public Health Papers, No. 34.
  82. Wynne, B. (1996). Misunderstood misunderstandings. Social identities and public uptake of science. In A. Irwin & B. Wynne (Eds.), Misunderstanding science? The public reconstruction of science and technology (pp. 19–46). Cambridge: Cambridge University Press.
  83. Wynne, B. (2001). Creating public alienation: Expert cultures of risk and ethics on GMOs. Science as Culture, 10(4), 446–481.
  84. Wynne, B. (2006). Public engagement as a means of restoring public trust in science—hitting the note, but missing the music? Community Genetics, 9(3), 211–220.
  85. Yager, R. E. (1992). Science-technology-society as reform. In R. E. Yager (Ed.), The status of science-technology-society: Reform efforts around the world. ICASE Yearbook 1992 (pp. 2–8). Hong Kong: International Council of Associations of Science Education.
  86. Zeidler, D. L., Sadler, T. D., Simmons, M. L., & Howes, E. V. (2005). Beyond STS: A research-based framework for socio-scientific issues education. Science Education, 89, 357–377.
  87. Zwart, S. D., Van de Poel, I., Van Mil, H., & Brumsen, M. (2006). A network approach for distinguishing ethical issues in research and development. Science and Engineering Ethics, 12, 663–684.

Copyright information

© The Author(s) 2012

Authors and Affiliations

  • Dirk Jan Boerwinkel (1, 2)
  • Tsjalling Swierstra (3)
  • Arend Jan Waarlo (1, 4)

  1. Freudenthal Institute for Science and Mathematics Education, Utrecht University, Utrecht, The Netherlands
  2. Cancer Genomics Centre, University Medical Centre Utrecht, Utrecht, The Netherlands
  3. Department of Philosophy, Maastricht University, Maastricht, The Netherlands
  4. Centre for Society and the Life Sciences, Radboud University, Nijmegen, The Netherlands
