
Assumptions of the Deficit Model Type of Thinking: Ignorance, Attitudes, and Science Communication in the Debate on Genetic Engineering in Agriculture


Abstract

This paper spells out and discusses four assumptions of the deficit model type of thinking. The assumptions are: First, the public is ignorant of science. Second, the public has negative attitudes towards (specific instances of) science and technology. Third, ignorance is at the root of these negative attitudes. Fourth, the public’s knowledge deficit can be remedied by one-way science communication from scientists to citizens. It is argued that there is nothing wrong with ignorance-based explanations per se. Ignorance accounts at least partially for many cases of opposition to specific instances of science and technology. Furthermore, more attention needs to be paid to the issue of relevance. In regard to the evaluation of a scientific experiment, a technology, or a product, the question is not only “who knows best?” but also “what knowledge is relevant, and to what extent?” Examples are drawn primarily from the debate on genetic engineering in agriculture.


Notes

  1. Other phrases employed include “cognitive deficit model of the public understanding of science” (Wynne 1991), “deficit model of public attitudes” (Sturgis and Allum 2004), “deficit model of science communication” (Hails and Kinderlerer 2003; Dickson 2005), “deficit model of public understanding of science” (Sturgis et al. 2005), “deficit model of understanding” (Royal Society 2004), “public deficit models” (Wynne 2006), and “deficit approach” (Bubela et al. 2009).

  2. Instances of the deficit model type of thinking no doubt go further back in time. See e.g., Wynne (2006, esp. 214).

  3. One might ask whether the persistently high percentage of wrong answers can be partially explained by the way the survey has formulated the particular question.

  4. A common response to the GMO controversy by many national (advisory) biotechnology boards has been an attempt to educate the public about genetic engineering in agriculture. Sometimes this has taken the form of producing information leaflets one after another. (By this I do not mean that basic informing through leaflets and various other popular science outlets would not be needed. It should also be noted that at least some of these boards have informing and educating the general public as part of their mission.) Certain decision-making practices may also uphold the deficit frame. It has been argued that the current public consultation practices on the deliberate release and placing on the market of GMOs in the EU fail to yield genuine two-way communication between policymakers and the general public (Ahteensuu and Siipi 2009).

  5. As a response to Bonnie Wintle, Mark Burgman, and Fiona Fidler’s criticism (2007, 327), Currall et al. (2007, 328) state that they do not take the stance that simply educating the public about science would lead to public acceptance of nanotechnology. For recent discussion of ideas related to the deficit model in the context of nanotechnology see e.g., Brown (2009), Jones (2008), Kahan et al. (2009); see also Currall (2008), Scheufele et al. (2008a, b).

  6. In this paper I use the term “development” in a neutral sense. It does not imply any evaluative stance or commitment.

  7. The 1985 report from the Royal Society in fact goes even further and suggests that the public understanding of science is vital to the future well-being of society and critical to a nation’s competitiveness. Scientists should consider promoting the public understanding of science as their responsibility.

  8. Needless to say, science knowledge encompasses knowledge about the scientific basis of technologies.

  9. In fact, Eurobarometers on biotechnology show some, albeit rather small, increase in the public’s knowledge of biology and genetics.

  10. Ignorance is typically closely connected to lack of interest. People do not seek information on issues that they are not interested in. However, the connection is contingent. Even if a person is highly interested in some subject, this does not always result in adequate knowledge or understanding of the subject matter in question. S/he might for example acquire information from sources that are not trustworthy or accurate, or alternatively misunderstand adequate information provided.

  11. Not all epistemologists agree with this. For discussion on the nature of and relationship between knowledge and understanding see e.g., Kvanvig (2003).

  12. When compared with the 2005 survey, a minor general shift towards skepticism can be observed.

  13. These kinds of distinctions may be useful for analytic purposes. In most instances of negative attitudes both types of opposition are present to a certain degree.

  14. A proponent of the deficit model might try to respond that the relevant kinds of attitudes are only ignorance-based negative attitudes, not all negative attitudes. This kind of response, however, misses the point. Although it may work in regard to the second assumption, the response undermines the explanatory strategy of the deficit model type of thinking. It is detrimental in regard to the third assumption. The response results in an undesirable kind of tautology: ignorance is at the root of negative attitudes only when ignorance is at the root of negative attitudes. The fact that many explanations—in fact all deductive-nomological and deductive-statistical explanations—are tautologies in the strict sense does not help. If the explanandum (i.e., the relevant kinds of negative attitudes) is narrowed down by definition, rather than made understandable by reference to a (statistical) non-accidental regularity supported by empirical studies, the tautology in question becomes uninformative and non-explanatory.

  15. Although the argument from ignorance (argumentum ad ignorantiam), a fallacy in argumentation, has no direct bearing on the deficit model type of thinking, the normative perspective to ignorance-based attitudes comes close to it. The argument from ignorance takes many forms, and not all of them are problematic (see e.g., Walton 1992).

  16. The European Environment Agency’s report Late Lessons from Early Warnings: The Precautionary Principle 1896–2000 (EEA 2001) examines fourteen case studies in which no precautions were taken under uncertainty, and the serious consequences of this omission.

  17. It is rather clear that negative attitudes are not always associated with mistrust. I do not like cars, but I experience no mistrust of them. Some kinds of negative attitudes are typically present when someone mistrusts a thing or a person. Yet there are cases in which this does not hold. I might mistrust my friend as I know that s/he is a pathological liar even if I do not have any negative attitudes towards her/him (at least in the common sense of the word).

  18. The Royal Society of London for the Improvement of Natural Knowledge.

  19. Many authors who have published on the deficit model use causal language (i.e., phrases such as “causes,” “is at the root of,” etc.). It should be noted, however, that the connection between levels of knowledge and attitudes is not a causal one in the strict sense of the word. It is concerned with the ways in which individuals reach judgments and form attitudes. (Admittedly, causality is often interpreted more loosely in the social sciences than in the natural sciences.) One related, rather absurd implication of a strict reading of the deficit model type of thinking is that the absence of something (i.e., lack of knowledge) causes something else (negative attitudes).

  20. This is not to say that religious and/or ethical beliefs would preclude factual knowledge. In forming ethical beliefs (X is morally good/bad, right/wrong, desirable/acceptable/prohibited, etc.) the knowledge component is typically considered to play an important justifying role.

  21. This mismatch might be a problem especially in the context of the study of complex systems such as GM agriculture and climate change.

  22. It might be useful to distinguish between two types of knowledge, namely “knowledge that” (i.e., propositional knowledge) and “knowing something” (knowledge as familiarity, being able to recognize or identify something or somebody). A fear of the unknown, in its many forms, is common among human beings. (Natural curiosity, however, pulls somewhat in the opposite direction.) Now if fears and other opposition are based on “not knowing” in the latter sense, communicating the facts might not be as useful as direct exposure to the particular technological innovation in question.

  23. Anxiety, hostility, and some forms of fear have weaker cognitive components than other kinds of negative attitudes. Some forms of fear are instinctive, others are more cognitive. As an example of the latter, I might be afraid of unemployment if it would result in my failing to make my monthly mortgage repayments.

  24. It is difficult to say to what extent public opposition is due to cognitive reasons, and I do not claim that the public is always knowledgeable or rational in its opposition. The “yuck” factor surely plays a role. Sometimes, however, emotional responses may provide useful knowledge or advice, e.g., by warning of potential dangers and directing attention.

  25. The term “value” is understood here in a very general sense. Moral judgments—such as the belief that causing unnecessary pain is morally wrong—are one instance of them.

  26. To approach the issue of relevance from the opposite direction, the question is what normative conclusions a certain piece or body of scientific knowledge warrants. For seminal work discussing and questioning the “is”/“ought” gap see e.g., Foot (1958), Frankena (1939), Searle (1964).

  27. The deficit model (type of thinking) does not specify whether the basis of negative attitudes is cognitive or non-cognitive. Opposition may be based on reasons or on other factors. Nor does it specify whether the lack of scientific knowledge refers to one’s having no factual beliefs on the subject or to scientifically mistaken (i.e., unscientific) beliefs. It is sensible to assume that both are included. Moreover, in its basic form the deficit model does not say anything about the presence or absence of non-scientific beliefs, such as ethical beliefs, certain religious beliefs, and superstition. The deficit model, however, implies that the positive attitudes towards, or support for, (specific instances of) science and technology arising from the uptake of information are cognitive. This follows because it is scientific knowledge that makes the difference. Attitudes change from negative towards positive owing to scientific knowledge. Either (1) scientific knowledge “causes” the change from non-cognitive opposition to cognitive support, or (2) scientific knowledge adds factual beliefs where there were previously none and results in cognitive support, or (3) earlier scientifically mistaken beliefs are revised in the light of new scientific information and this results in cognitive support.

  28. Another issue is that the freedom of scientific research on GMOs may be compromised in an undesirable way by intellectual property rights of industry as suggested by the editors of Scientific American (Editors 2009). Some seed companies have veto power over the work of independent researchers on their crops.

  29. The connection between certain assumptions may seem—and in fact be—somewhat contradictory. An example: if one totally lacks understanding and knowledge of something, it may be asked how s/he could have negative attitudes towards that particular thing, since s/he obviously fails to identify the thing in question.

References

  • Ahteensuu, M., & Siipi, H. (2009). A critical assessment of public consultations on GMOs in the European Union. Environmental Values, 18(2), 129–152.
  • Bonny, S. (2003). Why are most Europeans opposed to GMOs? Factors explaining rejection in France and Europe. Electronic Journal of Biotechnology, 6, 50–71.
  • Brown, S. (2009). The new deficit model. Nature Nanotechnology, 4, 609–611.
  • Bubela, T., et al. (2009). Science communication reconsidered. Nature Biotechnology, 27, 514–518.
  • Bucchi, M., & Neresini, F. (2002). Biotech remains unloved by the more informed. Nature, 416, 261.
  • Cook, G., Pieri, E., & Robbins, P. T. (2004). The scientists think and the public feels: Expert perceptions of the discourse of GM food. Discourse and Society, 15(4), 433–449.
  • Currall, S. C. (2008). New insights into public perceptions. Nature Nanotechnology, 4, 79–80.
  • Currall, S. C., et al. (2006). What drives public acceptance of nanotechnology? Nature Nanotechnology, 1, 153–155.
  • Currall, S. C., et al. (2007). Authors’ response. Nature Nanotechnology, 2, 327–328.
  • Dickson, D. (2005). The case for a ‘deficit model’ of science communication. Science and Development Network. http://www.scidev.net/en/editorials/the-case-for-a-deficit-model-of-science-communic.html. Accessed on November 17, 2010.
  • Durant, J. R., Evans, G. A., & Thomas, G. P. (1989). The public understanding of science. Nature, 340, 11–14.
  • Editors. (2009). A seedy practice. Scientific American, 301(2), 22.
  • EEA = European Environment Agency. (2001). Late lessons from early warnings: The precautionary principle 1896–2000. http://reports.eea.eu.int/environmental_issue_report_2001_22/en/Issue_Report_No_22.pdf. Accessed on November 17, 2010.
  • Einsiedel, E. (2007). Editorial: Of publics and science. Public Understanding of Science, 16(1), 5–6.
  • European Commission. (2005). Special Eurobarometer 224: Europeans, science and technology.
  • European Commission. (2008a). Special Eurobarometer 295: Attitudes of European citizens towards the environment.
  • European Commission. (2008b). Qualitative study on the image of science and the research policy of the European Union.
  • European Commission. (2010). Special Eurobarometer 340: Science and technology.
  • Evans, R. (2008). The sociology of expertise: The distribution of social fluency. Sociology Compass, 2(1), 281–298.
  • Evans, G. A., & Durant, J. R. (1995). The relationship between knowledge and attitudes in the public understanding of science in Britain. Public Understanding of Science, 4, 57–74.
  • Foot, P. (1958). Moral arguments. Mind, 67(268), 502–513.
  • Frankena, W. K. (1939). The naturalistic fallacy. Mind, 48(192), 464–477.
  • Gaskell, G., et al. (2006). Europeans and biotechnology in 2005: Patterns and trends: Eurobarometer 64.3. http://www.ec.europa.eu/research/press/2006/pdf/pr1906_eb_64_3_final_report-may2006_en.pdf. Accessed on November 17, 2010.
  • Gaskell, G., Allum, N., & Stares, S. (2003). Europeans and biotechnology in 2002: Eurobarometer 58.0.
  • Gaskell, G., et al. (1999). Worlds apart? The reception of genetically modified foods in Europe and the US. Science, 285, 384–387.
  • Gregory, J., & Lock, J. (2008). The evolution of ‘Public Understanding of Science’: Public engagement as a tool of science policy in the UK. Sociology Compass, 2(4), 1252–1265.
  • Hails, R., & Kinderlerer, J. (2003). The GM public debate: Context and communication strategies. Nature Reviews Genetics, 4, 819–825.
  • Hansen, J., et al. (2003). Beyond the knowledge deficit: Recent research into lay and expert attitudes to food risks. Appetite, 41, 111–121.
  • Hansson, S. O. (2008). Science and pseudo-science. Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/pseudo-science/. Accessed on September 27, 2010.
  • INRA (Europe)—ECOSA. (2000). Eurobarometer 52.1: The Europeans and biotechnology.
  • Irwin, A., & Wynne, B. (Eds.). (1996). Misunderstood science? The public reconstruction of science and technology. Cambridge: Cambridge University Press.
  • Jones, M. (2008). Fearing the fear of nanotechnology. Nature (Dec. 9), 1290.
  • Kahan, D. M., et al. (2009). Cultural cognition of the risks and benefits of nanotechnology. Nature Nanotechnology, 4, 87–90.
  • Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58(9), 697–720.
  • Kvakkestad, V., et al. (2007). Scientists’ perspectives on the deliberate release of GM crops. Environmental Values, 16, 79–104.
  • Kvanvig, J. (2003). The value of knowledge and the pursuit of understanding. Cambridge: Cambridge University Press.
  • Lidskog, R. (2008). Scientised citizens and democratised science: Re-assessing the expert-lay divide. Journal of Risk Research, 11(1–2), 69–86.
  • Louët, S. (2001). EC study reveals an informed public. Nature Biotechnology, 19, 15–16.
  • Marris, C., et al. (2001). Public perceptions of agricultural biotechnologies in Europe. Final report of the PABE research project. Commissioned by the EC.
  • Martin, S., & Tait, J. (1992). Attitudes of selected public groups in the UK to biotechnology. In J. Durant (Ed.), Biotechnology in public: A review of recent research (pp. 28–41). London: Science Museum.
  • Midden, C., et al. (2002). The structure of public perceptions. In M. W. Bauer & G. Gaskell (Eds.), Biotechnology: The making of a global controversy (pp. 203–223). Cambridge: Cambridge University Press.
  • National Science Board. (2004). Science and engineering indicators 2004. Two volumes. Arlington, VA: National Science Foundation (volume 1, NSB 04-1; volume 2, NSB 04-1A).
  • National Science Board. (2010). Science and engineering indicators 2010. Arlington, VA: National Science Foundation (NSB 10-01).
  • O’Neill, O. (2002). Autonomy and trust in bioethics. Port Chester, NY: Cambridge University Press.
  • Pardo, R., & Calvo, F. (2006). Are Europeans really antagonistic to biotech? Nature Biotechnology, 24(4), 393–395.
  • Pardo, R., Midden, C., & Miller, J. D. (2002). Attitudes toward biotechnology in the European Union. Journal of Biotechnology, 98, 9–24.
  • Peters, H. P. (2000). From information to attitudes? Thoughts on the relationship between knowledge about science and technology and attitudes toward technology. In M. Dierkes & C. von Grote (Eds.), Between understanding and trust: The public, science and technology (pp. 265–286). Amsterdam: Harwood.
  • Royal Society. (1985). The public understanding of science: Report of the Royal Society’s ad hoc group. London: The Royal Society.
  • Royal Society. (2004). Science in society report. London: The Royal Society.
  • Savadori, L., et al. (2004). Expert and public perceptions of risk from biotechnology. Risk Analysis, 24(5), 1289–1299.
  • Scheufele, D. A., et al. (2008a). Religious beliefs and public attitudes toward nanotechnology in Europe and the United States. Nature Nanotechnology, 4, 91–94.
  • Scheufele, D. A., et al. (2008b). Scientists worry about some risks more than the public. Nature Nanotechnology, 2, 732–734.
  • Searle, J. R. (1964). How to derive “ought” from “is”. Philosophical Review, 73(1), 43–58.
  • Slovic, P., Peters, E., Finucane, M. L., & MacGregor, D. (2005). Affect, risk, and decision making. Health Psychology, 24(4), S35–S40.
  • Sturgis, P., & Allum, N. (2004). Science in society: Re-evaluating the deficit model of public attitudes. Public Understanding of Science, 13, 55–74.
  • Sturgis, P., Cooper, H., & Fife-Schaw, C. (2005). Attitudes to biotechnology: Estimating the opinions of a better-informed public. New Genetics and Society, 24(1), 31–56.
  • Walton, D. (1992). Nonfallacious arguments from ignorance. American Philosophical Quarterly, 29(4), 381–387.
  • Waltz, E. (2009). GM crops: Battlefield. Nature, 461, 27–32.
  • Whiteside, K. H. (2006). Precautionary politics: Principle and practice in confronting environmental risk. Cambridge, MA: The MIT Press.
  • Wiener, J. B., & Rogers, M. D. (2002). Comparing precaution in the United States and Europe. Journal of Risk Research, 5, 317–349.
  • Wintle, B., Burgman, M., & Fidler, F. (2007). How fast should nanotechnology advance? Nature Nanotechnology, 2, 327.
  • Wright, N., & Nerlich, B. (2006). Use of the deficit model in a shared culture of argumentation: The case of foot and mouth science. Public Understanding of Science, 15, 331–342.
  • Wynne, B. (1991). Knowledges in context. Science, Technology, and Human Values, 16, 111–121.
  • Wynne, B. (1996). May the sheep safely graze? A reflexive view of the expert-lay knowledge divide. In S. Lash, B. Szerszynski, & B. Wynne (Eds.), Risk, environment and modernity: Towards a new ecology (pp. 44–83). London: Sage.
  • Wynne, B. (2006). Public engagement as means of restoring public trust in science—Hitting the notes, but missing the music. Community Genetics, 9, 211–220.
  • Ziman, J. (1991). Public understanding of science. Science, Technology, and Human Values, 16, 99–105.


Acknowledgments

This work has been financially supported by the Academy of Finland. While working on this paper, I have greatly benefited from discussions with, and specific suggestions made by, Helena Siipi. I want to thank Rebecca Whitlock, attendees who commented on my presentation at the WCB2010 in Singapore, and participants of the PCRC and TMSC weekly seminars at the University of Turku, Finland, for useful comments on earlier versions of this paper. Three anonymous reviewers of the Journal of Agricultural and Environmental Ethics made helpful points and suggestions. Remaining errors are mine.

Author information

Correspondence to Marko Ahteensuu.


Cite this article

Ahteensuu, M. Assumptions of the Deficit Model Type of Thinking: Ignorance, Attitudes, and Science Communication in the Debate on Genetic Engineering in Agriculture. J Agric Environ Ethics 25, 295–313 (2012). https://doi.org/10.1007/s10806-011-9311-9

