Negative results are disappearing from most disciplines and countries
Concerns that the growing competition for funding and citations might distort science are frequently discussed, but have not been verified directly. Of the hypothesized problems, perhaps the most worrying is a worsening of positive-outcome bias. A system that disfavours negative results not only distorts the scientific literature directly, but might also discourage high-risk projects and pressure scientists to fabricate and falsify their data. This study analysed over 4,600 papers published in all disciplines between 1990 and 2007, measuring the frequency of papers that, having declared to have “tested” a hypothesis, reported positive support for it. The overall frequency of positive results grew by over 22% between 1990 and 2007, with significant differences between disciplines and countries. The increase was stronger in the social sciences and in some biomedical disciplines. Over the years, the United States published significantly fewer positive results than Asian countries (particularly Japan) but more than European countries (particularly the United Kingdom). Methodological artefacts cannot explain away these patterns, which support the hypotheses that research is becoming less pioneering and/or that the objectivity with which results are produced and published is decreasing.
Keywords: Bias · Misconduct · Research evaluation · Publication · Publish or perish · Competition
Robin Williams gave helpful comments, and François Briatte crosschecked the coding protocol. This work was supported by a Marie Curie Intra-European Fellowship (Grant Agreement Number PIEF-GA-2008-221441) and a Leverhulme Early-Career fellowship (ECF/2010/0131).