Abstract

Much like the bubbles that arise in financial markets, bubbles can appear in the market for science. When an economic bubble bursts, falling prices wipe out unsustainable investments, triggering a crisis of investor confidence that may be followed by a financial panic. When bubbles appear in science, however, truth and reliability are the first victims. This paper explores how fashions in research funding and research management may turn science into something like a bubble economy.

Notes

  1. In the following, we conceive of cognitive neuroscience broadly, as including a diverse set of disciplines ranging from neuropsychology and neurolinguistics to artificial intelligence and cognitive sociology. Most of these studies are based on advances in fMRI technologies. We do not discuss subcellular and molecular neuroscience, which are often less glamorous and more explanation-oriented.

  2. A number of important papers on the economics of scientific knowledge have been collected in Stephan and Audretsch (2000) and in Mirowski and Sent (2002).

  3. The contours of the Brain Activity Map project were laid out in a 2011 white paper, which states that neuroscientists are “on the verge to illuminate… the impenetrable jungles of brain functions, by mapping and stimulating neural circuits with cellular and millisecond-level resolution.” This position paper was published in Neuron (Alivisatos et al. 2012, Neuron 74, 970–974), later handed over to the White House's Office of Science and Technology Policy, and accepted for funding in 2013.

  4. Let us be clear on this point. We do not claim that cognitive neuroscience is flawed or irrelevant. Some neural correlations do aid psychological explanation in important respects. For example, identifying neural events underlying vision constrains explanations of timing in psychological processes and has helped predict psychological effects, as demonstrated by Burge (2010). Brain activity is necessary for psychological phenomena, but its relation to them is complex. Moreover, explanations of neurological phenomena are not themselves explanations of psychological phenomena. On the contrary, scientists can understand correlations between neural and psychological states only by engaging psychological explanations (see Bennett and Hacker 2003).

  5. The sample for the study was 49 meta-analyses (studies that analyze data from other studies, in this case 730 individual neuroscience studies) published in 2011. According to the study, low statistical power (caused by small sample sizes, small effects, or both) reduces the likelihood that a nominally statistically significant finding actually reflects a true effect. Low-powered studies thus suffer from (1) a low probability of finding true effects, (2) a low positive predictive value, and (3) an exaggerated estimate of the magnitude of an effect when a true effect is discovered (Button et al. 2013, pp. 1–2).

  6. The authors suggest that when low-powered studies claim a scientific breakthrough, that breakthrough is more likely to be exaggerated or false. This is because the “probability of a research finding being true is related to the pre-study odds of that finding being true. These odds are higher for confirmatory or replication studies testing pre-specified hypotheses, as these have the weight of previous evidence or theory behind them.” The odds are lower for studies that make no prior predictions, leaving the findings more open to chance (Button et al. 2013, p. 1). A worked illustration of this relation is given after these notes.

  7. To test this hypothesis, the group examined people's judgments of explanations that either do or do not contain neuroscience information but that otherwise do not differ in content or logic. All three studies (experts, students, and lay citizens) reportedly used a 2 (explanation type: good vs. bad) × 2 (neuroscience: without vs. with) design. This allowed the research group to assess both people's baseline ability to distinguish good psychological explanations from bad ones and any influence of neuroscience information on this ability (Weisberg et al. 2008, pp. 470–471).

  8. Note that none of these informational phenomena is taken into account by the existing literature on the economics of scientific knowledge (Kitcher 1990; Goldman and Shaked 1991; Bonilla 2012). This literature generally assumes that scientific agents behave according to traditional axioms of rational economic behavior. In the analysis we present here, however, we opt to integrate results from behavioral economics and social psychology to balance the idealized game-theoretical account of rational scientific agents found in parts of the economic literature on science (Franck 2002).

  9. Lakatos referred to the additional hypotheses supplementing the hard core as the protective belt, to underline its function of protecting the hard core from falsification: “We must use our ingenuity to articulate or even invent auxiliary hypotheses, which form a protective belt around this core, and we must redirect the modus tollens to these. It is this protective belt of auxiliary hypotheses which has to bear the brunt of tests and get adjusted and re-adjusted, or even completely replaced, to defend the thus-hardened core” (Lakatos 1978, p. 48).

  10. We have in mind, in particular, the scientific norms introduced by Robert K. Merton (1942) in his essay “The Normative Structure of Science.” Merton described four sets of institutional norms comprising the constitutive rules of science: 1. “Communalism” (scientific results are the common property of the entire scientific community); 2. “Universalism” (all scientists can contribute to science regardless of race, nationality, and gender); 3. “Disinterestedness” (scientists are expected to act for the benefit of the scientific enterprise, not for personal gain); and 4. “Organized Skepticism” (all scientific claims must be subjected to critical scrutiny). For a re-examination of the relevance of the Mertonian norms to contemporary academic life, see Macfarlane and Cheng (2008).

  11. As one of the anonymous reviewers of this article perceptively remarked, there is a great deal of literature, dating back to Vannevar Bush and Michael Polanyi, on the unpredictability of scientific investments. Indeed, this is crucial: if all members of a group decide to work on the same research topics, no one will work on alternative methodologies, which might eventually turn out to be more promising.
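
A worked illustration of the power relation discussed in notes 5 and 6 may be helpful. The probability that a nominally significant finding reflects a true effect (the positive predictive value, PPV) depends on the statistical power (1 − β), the significance threshold α, and the pre-study odds R that the tested effect is real. The display below is a minimal sketch of the standard relation used by Button et al. (2013); the numbers plugged in (R = 0.2, α = 0.05) are illustrative assumptions of ours, not figures taken from the studies discussed.

    % Positive predictive value of a nominally significant finding
    % (relation used by Button et al. 2013; R denotes the pre-study odds)
    \[
      \mathrm{PPV} \;=\; \frac{(1-\beta)\,R}{(1-\beta)\,R + \alpha}
    \]
    % Illustrative assumption: R = 0.2, alpha = 0.05.
    % Power 0.8:  PPV = (0.8)(0.2) / ((0.8)(0.2) + 0.05) = 0.16/0.21 ≈ 0.76
    % Power 0.2:  PPV = (0.2)(0.2) / ((0.2)(0.2) + 0.05) = 0.04/0.09 ≈ 0.44
    % Lower power thus lowers the chance that a "significant" result is true
    % (consequence 2 in note 5), before any effect-size exaggeration is considered.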

References

  • Alivisatos, A. P., et al. (2012). The brain activity map project and the challenge of functional connectomics. Neuron, 74, 970–974.

  • Bennett, M. R., & Hacker, P. M. S. (2003). Philosophical foundations of neuroscience. Malden: Blackwell.

  • Bonilla, J. P. Z. (2012). The economics of scientific knowledge. In U. Mäki (Ed.), Handbook of the philosophy of science. The philosophy of economics. New York: Elsevier.

  • Brown, N., & Michael, M. (2003). A sociology of expectations: retrospecting prospects and prospecting retrospects. Technology Analysis & Strategic Management, 15(1), 3–18.

  • Buchanan, M. (2008). Why economic theory is out of whack. New Scientist, 19 July 2008.

  • Budtz Pedersen, D. (2013). Research evaluation. Ethics, science, technology, and engineering. New York: Macmillan.

  • Burge, T. (2010). A real science of mind. The New York Times: The Stone, 19 December 2010. Accessed online 15 Aug 2013.

  • Butler, L. (2004). What happens when funding is linked to publication counts? In H. F. Moed, W. Glänzel, & U. Schmoch (Eds.), Handbook of quantitative science and technology research: the use of publication and patent statistics in studies of S&T systems (pp. 389–405). Dordrecht: Kluwer.

  • Button, K. (2013). Unreliable neuroscience? Why power matters. The Guardian, Wednesday 10 April 2013.

  • Button, K. S., Ioannidis, J. P. A., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S. J., & Munafò, M. R. (2013). Power failure: why small sample size undermines the reliability of neuroscience. Nature Reviews Neuroscience, 14(5), 365–376.

  • Centola, D., Willer, R., & Macy, M. (2005). The emperor's dilemma: a computational model of self-enforcing norms. American Journal of Sociology, 110(4), 1009–1040.

  • Chen, X. (1988). Reconstruction of the optical revolution. Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1988, 103–109.

  • Conway, B. R., & Rehding, A. (2013). Neuroaesthetics and the trouble with beauty. PLoS Biology, 11(3), e1001504.

  • Elger, E., Friederici, A., Koch, C., Luhmann, H., Malsburg, C., Menzel, R., et al. (2004). “The Manifesto” (original text in German). Brain and Mind, 6, 30–37.

  • Elzinga, A. (2004). The new production of reductionism in research policy models. In K. Grandin (Ed.), The science-industry nexus (pp. 277–304). Sagamore Beach.

  • Felt, U. (2007). Taking European knowledge society seriously. Brussels: European Commission.

  • Franck, G. (2002). The scientific economy of attention: a novel approach to the collective rationality of science. Scientometrics, 55(1), 3–26.

  • Gerrans, P. (2009). Bubble trouble. Times Higher Education, 9 July 2009.

  • Goldman, A., & Shaked, M. (1991). An economic model of scientific activity and truth acquisition. Philosophical Studies, 63, 31–55.

  • Hansen, P. G., & Hendricks, V. F. (2014). Infostorms: how to take information punches and save democracy. New York: Copernicus Books.

  • Hansen, P. G., Hendricks, V. F., & Rendsvig, R. K. (2013). Infostorms. Metaphilosophy, 44(3), 301–326.

  • Hedgecoe, A., & Martin, P. (2003). The drugs don't work: expectations and the shaping of pharmacogenetics. Social Studies of Science, 33(3), 327–364.

  • Hendricks, V. F., & Lundorff-Rasmussen, J. (2012). Nedtur! Finanskrisen forstået filosofisk [Downturn! The financial crisis understood philosophically]. Copenhagen: Gyldendal Business.

  • Hendricks, V.F. & Rendsvig, R.K. (2013). Structures of social proof (forthcoming).

  • Katz, D., & Allport, F. H. (1931). Student attitudes. Syracuse: Craftsman.

  • Keats, J. (2013). The $1.3B quest to build a supercomputer replica of a human brain. Wired Magazine, London, July 2013, pp. 128–135.

  • Kitcher, P. (1990). The division of cognitive labor. The Journal of Philosophy, 87(1), 5–22.

  • Lakatos, I. (1978). The methodology of scientific research programmes. Cambridge: Cambridge University Press.

  • Latané, B., & Darley, J. M. (1968). Bystander apathy. American Scientist, 57, 244–268.

  • Latané, B., & Nida, S. (1981). Ten years of research on group size and helping. Psychological Bulletin, 89(2), 308–324.

  • Laudel, G. (2006). The art of getting funded: how scientists adapt to their funding conditions. Science and Public Policy, 33(7), 489–504.

  • Lee, I. H. (1998). Market crashes and informational avalanches. Review of Economic Studies, 65, 741–759.

  • Lütge, C. (2004). Economics in philosophy of science: can the dismal science contribute anything interesting? Synthese, 140(3), 279–305.

  • Macfarlane, B., & Cheng, M. (2008). Communism, universalism and disinterestedness: re-examining contemporary support among academics for Merton's scientific norms. Journal of Academic Ethics, 6, 67–78.

  • Merton, R. K. (1942). The normative structure of science. In R. K. Merton (Ed.), The sociology of science. Chicago: University of Chicago Press.

  • Mirowski, P. (2012). The modern commercialization of science is a passel of Ponzi schemes. Social Epistemology: A Journal of Knowledge, Culture and Policy, 26(4), 285–310.

  • Mirowski, P., & Sent, E.-M. (2002). Science bought and sold. Chicago: University of Chicago Press.

  • Nowotny, H., Scott, P., & Gibbons, M. (2001). Re-thinking science: knowledge and the public in an age of uncertainty. London: Polity.

  • Rip, A. (1988). Contextual transformations in contemporary science. In Keeping science straight: a critical look at the assessment of science and technology (pp. 59–85). University of Gothenburg Press.

  • Rip, A. (2009). Futures of ELSA. EMBO Reports, 10(7), 666–670.

  • Robinson, M. (2010). The privatization of neuroscience: the university, the state and the moral aims of science. Somatosphere.net. Accessed 11 Jan 2011.

  • Rose, N. (2013). The human sciences in a biological age. Theory, Culture & Society, 30, 3–34.

  • Selin, C. (2007). Expectations and the emergence of nanotechnology. Science, Technology & Human Values, 32(2), 196–220.

  • Stephan, P., & Audretsch, D. B. (Eds.). (2000). The economics of science and of innovation. Volume 2. Cheltenham: Edward Elgar.

  • Sun, R. et al. (2010). Proceedings of the workshop on cognitive social sciences: grounding the social sciences in the cognitive sciences. Portland, Oregon. August 11, 2010.

  • Tallis, R. (2011). Aping mankind. Durham: Acumen.

  • Vogel, H. L. (2010). Financial market bubbles and crashes. New York: Cambridge University Press.

  • Wadman, M. (2013). Behind the scenes of a brain-mapping moon shot. Nature Online, 495(7439), 19. http://www.nature.com.

  • Weber, A. A. (2009). Weber says ECB has used room to cut interest rates. www.bloomberg.com. Accessed May 2013.

  • Weingart, P. (2005). Impact of bibliometrics upon the science system: inadvertent consequences. Scientometrics, 62(1), 117–131.

  • Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470–477.

  • Whitley, R., & Gläser, J. (Eds.). (2007). The changing governance of the sciences: the advent of research evaluation systems. Dordrecht: Springer.

  • Zeki, S. (2013). Statement on neuroesthetics. Online Journal of Neuroesthetics. www.neuroesthetics.org/statement-on-neuroesthetics.php.

  • Ziman, J. (2001). Real science: what it is, and what it means. Cambridge: Cambridge University Press.

Acknowledgments

We wish to thank the following colleagues for helpful comments and discussion: Alexandru Baltag, Patrick Blackburn, Johan van Benthem, Finn Collin, Claus Emmeche, Claus Strue Frederiksen, Søren Gosvig Olesen, Erik J. Olsson, Rasmus K. Rendsvig, Dan Zahavi, Philip Pettit, John Symons, and the two anonymous reviewers. We gratefully acknowledge the support of The Velux Foundation and the Humanomics Research Programme.

Author information

Corresponding author

Correspondence to Vincent F. Hendricks.

About this article

Cite this article

Pedersen, D.B., Hendricks, V.F. Science Bubbles. Philos. Technol. 27, 503–518 (2014). https://doi.org/10.1007/s13347-013-0142-7
