European Journal of Epidemiology, Volume 33, Issue 11, pp 1021–1023

Massive citations to misleading methods and research tools: Matthew effect, quotation error and citation copying

  • John P. A. Ioannidis
COMMENTARY

Research methods and tools account for a lion’s share of the most cited papers across science [1]. Methodological tools are essential to make discoveries, assess them, organize our knowledge, and understand which information is valid and useful. Many methods and research tools are proposed, but few become widely utilized, and these are not always the best. For example, null-hypothesis significance testing with reporting of p-values is embedded in millions of papers [2], despite being a poor inferential method for most applications [3]. The factors that shape which methodological papers get widely cited are poorly understood. However, methods that are simple and easy to use (or misuse), and those that address major needs, are perhaps more prone to become popular. Conversely, esoteric and convoluted tools, those that are not readily practicable, and those that have relevance only to rare circumstances are unlikely to become citation classics.

Simplicity is a desirable feature, but oversimplification is not....

References

  1. Van Noorden R, Maher B, Nuzzo R. The top 100 papers. Nature. 2014;514(7524):550–3.
  2. Chavalarias D, Wallach JD, Li AH, Ioannidis JP. Evolution of reporting P values in the biomedical literature, 1990–2015. JAMA. 2016;315(11):1141–8.
  3. Nuzzo R. Statistical errors. Nature. 2014;506:150–2.
  4. Merton RK. The Matthew effect in science: the reward and communication systems of science are considered. Science. 1968;159(3810):56–63.
  5. Wetterer JK. Quotation error, citation copying, and ant extinctions in Madeira. Scientometrics. 2006;67:351–72.
  6. Simkin MV, Roychowdhury VP. Read before you cite! Complex Syst. 2003;14:269–74.
  7. Stang A, Jonas S, Poole C. Case study in major quotation errors: a critical commentary of the Newcastle–Ottawa scale. Eur J Epidemiol. 2018. https://doi.org/10.1007/s10654-018-0443-3.
  8. Stang A. Critical evaluation of the Newcastle–Ottawa scale for the assessment of the quality of nonrandomized studies in meta-analyses. Eur J Epidemiol. 2010;25:603–5.
  9. Wells GA, Shea B, O’Connell D, Peterson J, Welch V, Losos M, et al. The Newcastle–Ottawa Scale (NOS) for assessing the quality of nonrandomized studies in meta-analyses. http://www.ohri.ca/programs/clinical_epidemiology/oxford.asp. 2009.
  10. Hartling L, Milne A, Hamm MP, Vandermeer B, Ansari M, Tsertsvadze A, et al. Testing the Newcastle Ottawa scale showed low reliability between individual reviewers. J Clin Epidemiol. 2013;66(9):982–93.
  11. Lo CK, Mertz D, Loeb M. Newcastle–Ottawa scale: comparing reviewers’ to authors’ assessments. BMC Med Res Methodol. 2014;14:45.
  12. Margulis AV, Pladevall M, Riera-Guardia N, Varas-Lorenzo C, Hazell L, Berkman ND, et al. Quality assessment of observational studies in a drug-safety systematic review, comparison of two tools: the Newcastle–Ottawa scale and the RTI item bank. Clin Epidemiol. 2014;6:359–68.
  13. Ioannidis JP, Lau J. Can quality of clinical trials and meta-analyses be quantified? Lancet. 1998;352(9128):590–1.
  14. Bellou V, Belbasis L, Tzoulaki I, Evangelou E, Ioannidis JP. Environmental risk factors and Parkinson’s disease: an umbrella review of meta-analyses. Parkinsonism Relat Disord. 2016;23:1–9.
  15. Eichorn P, Yankauer A. Do authors check their references? A survey of accuracy of references in three public health journals. Am J Public Health. 1987;77:1011–2.
  16. Jergas H, Baethge C. Quotation accuracy in medical journal articles-a systematic review and meta-analysis. PeerJ. 2015;3:e1364.
  17. Sterne JA, Sutton AJ, Ioannidis JP, Terrin N, Jones DR, Lau J, et al. Recommendations for examining and interpreting funnel plot asymmetry in meta-analyses of randomised controlled trials. BMJ. 2011;343:d4002.
  18. Egger M, Davey Smith G, Schneider M, Minder C. Bias in meta-analysis detected by a simple, graphical test. BMJ. 1997;315:629–34.
  19. Ioannidis JP, Trikalinos TA. The appropriateness of asymmetry tests for publication bias in meta-analyses: a large survey. CMAJ. 2007;176:1091–6.
  20. Lau J, Ioannidis JP, Terrin N, Schmid CH, Olkin I. The case of the misleading funnel plot. BMJ. 2006;333:597–600.
  21. Greenberg SA. How citation distortions create unfounded authority: analysis of a citation network. BMJ. 2009;339:b2680.
  22. Tatsioni A, Bonitsis NG, Ioannidis JP. Persistence of contradicted claims in the literature. JAMA. 2007;298:2517–26.
  23. Budd JM, Sievert M, Schultz TR. Phenomena of retraction: reasons for retraction and citations to the publications. JAMA. 1998;280(3):296–7.
  24. Lewis S, Clarke M. Forest plots: trying to see the wood and the trees. BMJ. 2001;322:1479–80.
  25. Ioannidis JP, Chang CQ, Lam TK, Schully SD, Khoury MJ. The geometric increase in meta-analyses from China in the genomic era. PLoS ONE. 2013;12(8):e65602.
  26. Ioannidis JP. The mass production of redundant, misleading, and conflicted systematic reviews and meta-analyses. Milbank Q. 2016;94:485–514.
  27. Quan W, Chen B, Shu F. Publish or impoverish: an investigation of the monetary reward system of science in China (1999–2016). Aslib J Inf Manag. 2017;69:1–18.
  28. Hvistendahl M. China’s publication bazaar. Science. 2013;342:1035–9.

Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. Meta-Research Innovation Center at Stanford (METRICS), and Departments of Medicine, Health Research and Policy, Biomedical Data Science, and Statistics, Stanford University, Stanford, USA
  2. Stanford Prevention Research Center, Stanford, USA
