
Why Too Many Political Science Findings Cannot Be Trusted and What We Can Do About It: A Review of Meta-Scientific Research and a Call for Academic Reform

Alexander Wuttke
Commentary

Abstract

Witnessing the ongoing “credibility revolutions” in other disciplines, political science should also engage in meta-scientific introspection. Theoretically, this commentary describes why scientists in academia’s current incentive system work against their self-interest if they prioritize research credibility. Empirically, a comprehensive review of meta-scientific research with a focus on quantitative political science demonstrates that threats to the credibility of political science findings are systematic and real. Yet, the review also shows the discipline’s recent progress toward more credible research. The commentary proposes specific institutional changes to better align individual researcher rationality with the collective good of verifiable, robust, and valid scientific results.

Keywords

Open Science · Publication bias · Replication crisis · Replicability · Transparency

Research as a Social Dilemma: A Meta-Scientific Assessment of the Credibility of Political Science Findings and a Call to Change Academic Incentive Structures

Summary

In light of the "credibility revolutions" in other social sciences, questions about the reliability of institutionalized knowledge production are pertinent in political science as well. This commentary describes why researchers act against their self-interest when they prioritize research validity. A comprehensive review of the meta-scientific literature, with a focus on quantitative political science, points on the one hand to recently initiated reforms to safeguard reliable research. On the other hand, it reveals systematic problems with the credibility of published research findings. The commentary proposes concrete measures to bring individual researchers' incentives into line with the collective goal of reliable research.

Keywords

Open Science · Publication bias · Replication crisis · Reproducibility · Transparency


Copyright information

© Deutsche Vereinigung für Politikwissenschaft 2018

Authors and Affiliations

Universität Mannheim, Mannheim, Germany
