Positive results receive more citations, but only in some disciplines
Negative results are commonly assumed to attract fewer readers and citations, which would explain why journals in most disciplines tend to publish too many positive and statistically significant findings. This study tested this assumption by counting the citation frequencies of papers that, having declared to “test” a hypothesis, reported either “positive” (full or partial) or “negative” (null or negative) support. Controlling for various confounders, positive results were cited on average 32% more often. The citation advantage, however, was unequally distributed across disciplines (classified as in the Essential Science Indicators database). Using Space Science as the reference category, the citation differential was positive and formally statistically significant only in Neuroscience & Behaviour, Molecular Biology & Genetics, Clinical Medicine, and Plant & Animal Science. Overall, the effect was significantly higher amongst applied disciplines, and in the biological compared to the physical and the social sciences. The citation differential was not a significant predictor of the actual frequency of positive results amongst the 20 broad disciplines considered. Although future studies should attempt more fine-grained assessments, these results suggest that publication bias may have different causes and require different solutions depending on the field considered.
Keywords: Bias · File-drawer · Citations · Competition · Publication · Research evaluation