A Systems Approach to Understanding and Improving Research Integrity
Concern about the integrity of empirical research has arisen in recent years in light of studies showing that the vast majority of publications in academic journals report positive results, that many of these results are false and cannot be replicated, and that many positive results are the product of data dredging and the application of flexible data analysis practices coupled with selective reporting. While a number of potential solutions have been proposed, their effects are poorly understood, and empirical evaluation of each would take many years. We propose that methods from the systems sciences be used to assess the effects, both positive and negative, of proposed solutions to the problem of declining research integrity, such as study registration, Registered Reports, and open access to methods and data. To illustrate the potential application of systems science methods to the study of research integrity, we describe three broad types of models: one built on the characteristics of specific academic disciplines; one a diffusion of research norms model conceptualizing researchers as susceptible, "infected," and recovered; and one conceptualizing publications as a product produced by an industry composed of academics who respond to incentives and disincentives.
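The second model type above borrows the susceptible-infected-recovered (SIR) structure from epidemiology. As a minimal sketch of how such a diffusion-of-norms model might be simulated, the following uses simple Euler-step compartmental dynamics; the population size, transmission rate (beta), and recovery rate (gamma) are illustrative assumptions, not values from the article.

```python
# Hypothetical SIR-style sketch of a "diffusion of research norms" model:
# susceptible researchers adopt questionable practices ("infected") through
# contact, and recover (e.g., via training or reform) at a fixed rate.
# All parameter values are illustrative assumptions.

def simulate_norm_diffusion(n_researchers=1000, beta=0.3, gamma=0.1,
                            initially_infected=10, steps=365):
    """Euler-step SIR dynamics over a fixed number of time steps.

    Returns a list of (susceptible, infected, recovered) tuples,
    one per step. The compartments always sum to n_researchers.
    """
    s = float(n_researchers - initially_infected)
    i = float(initially_infected)
    r = 0.0
    history = []
    for _ in range(steps):
        # New adoptions are proportional to contact between the
        # susceptible and "infected" compartments.
        new_infections = beta * s * i / n_researchers
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_norm_diffusion()
s, i, r = history[-1]
```

With beta greater than gamma (basic reproduction number above 1), the questionable practice spreads through most of the population before the recovered compartment dominates; varying the rates lets one explore, qualitatively, how interventions that lower transmission or speed recovery change the long-run prevalence.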
Keywords: Systems thinking · System dynamics · Research ethics · Publish or perish · Open data · Registered reports
Compliance with Ethical Standards
Conflict of interest
The authors declare that they have no conflict of interest.