
Why Too Many Political Science Findings Cannot Be Trusted and What We Can Do About It: A Review of Meta-Scientific Research and a Call for Academic Reform

Research as a Social Dilemma: A Meta-Scientific Assessment of the Credibility of Political Science Findings and an Appeal to Change Academic Incentive Structures

Abstract

Witnessing the ongoing “credibility revolutions” in other disciplines, political science should also engage in meta-scientific introspection. Theoretically, this commentary describes why scientists in academia’s current incentive system work against their self-interest if they prioritize research credibility. Empirically, a comprehensive review of meta-scientific research with a focus on quantitative political science demonstrates that threats to the credibility of political science findings are systematic and real. Yet, the review also shows the discipline’s recent progress toward more credible research. The commentary proposes specific institutional changes to better align individual researcher rationality with the collective good of verifiable, robust, and valid scientific results.

Summary

Given the "credibility revolutions" in other social sciences, questions about the reliability of institutional knowledge production also arise in political science. This commentary describes why scientists act against their self-interest when they prioritize research validity. A comprehensive review of the meta-scientific literature, focusing on quantitative political science, points on the one hand to recently initiated reforms to safeguard reliable research. On the other hand, the review reveals systematic problems in the credibility of published research findings. This commentary proposes concrete measures to align individual researcher incentives with the collective goal of reliable research.


Notes

  1. Different lines of thought in the Open Science movement comprise “the infrastructure school (which is concerned with the technological architecture), the public school (which is concerned with the accessibility of knowledge creation), the measurement school (which is concerned with alternative impact measurement), the democratic school (which is concerned with access to knowledge) and the pragmatic school (which is concerned with collaborative research)” (Fecher and Friesike 2014: 17).

  2. Even though similar discussions are gaining traction in other research cultures (Monroe 2018; Elman et al. 2018; Janz 2018), this commentary focuses on quantitative political science as published in English-language peer-reviewed journals, which has attracted the most meta-scientific attention in recent years. Although it is a debate worth having, it is beyond the scope of this commentary to discuss how the evidence and arguments presented here apply to other research cultures and publication formats in political science.

  3. Freese and Peterson (2017) call this type of replication verifiability.

  4. These criteria can be ordered hierarchically: the latter are more likely to be fulfilled when the former are met.

  5. Note that making one’s work accessible to inter-subjective assessment goes beyond data transparency and includes disclosure of data-analytical and processing procedures. Stockemer et al. (2018) discuss cases in which attempts to replicate prior results failed because neither the syntax nor the published article provided sufficient information to repeat the authors’ analytical steps.

  6. See https://opennessinitiative.org/. Accessed 20 September 2018.

  7. See https://politicalsciencereplication.wordpress.com/2015/05/04/leading-journal-verifies-articles-before-publication-so-far-all-replications-failed/. Accessed 28 August 2018.

  8. For details on the costs of AJPS’s verification processes, see https://www.insidehighered.com/blogs/rethinking-research/should-journals-be-responsible-reproducibility. Accessed 20 September 2018.

  9. Esarey and Wu (2016) estimate that the true value of statistical relationships is on average 40% smaller than their published value.

  10. Note that HARKing and confirmatory analysis are perfectly reconcilable, but only if new data are collected between the exploratory and the confirmatory step; without the collection of new data they are not: “Just as conspiracy theories are never falsified by the facts that they were designed to explain, a hypothesis that is developed on the basis of exploration of a data set is unlikely to be refuted by that same data. Thus, one always needs a fresh data set for testing one’s hypothesis.” (Wagenmakers et al. 2012, p. 633).

  11. See https://www.theatlantic.com/science/archive/2018/08/scientists-can-collectively-sense-which-psychology-studies-are-weak/568630/. Accessed 28 August 2018.


Correspondence to Alexander Wuttke.


About this article


Cite this article

Wuttke, A. Why Too Many Political Science Findings Cannot Be Trusted and What We Can Do About It: A Review of Meta-Scientific Research and a Call for Academic Reform. Polit Vierteljahresschr 60, 1–19 (2019). https://doi.org/10.1007/s11615-018-0131-7


Keywords

  • Open Science
  • Publication bias
  • Replication crisis
  • Replicability
  • Transparency

Keywords (German)

  • Open Science
  • Publication bias
  • Replication crisis
  • Reproducibility
  • Transparency