Grounds for Ambiguity: Justifiable Bases for Engaging in Questionable Research Practices

  • Donald F. Sacco
  • Mitch Brown
  • Samuel V. Bruton
Original Paper


The current study sought to determine research scientists’ sensitivity to various justifications for engaging in behaviors typically considered questionable research practices (QRPs) by asking them to evaluate the appropriateness and ethical defensibility of each. Using a within-subjects design, 107 National Institutes of Health principal investigators responded to an invitation to complete an online survey in which they read a series of research behaviors determined, in prior research, to be either ambiguous or unambiguous in their ethical defensibility. Additionally, each behavior was paired with either an ostensibly sound or unsound reason for the behavior. Consistent with hypotheses, the results indicated that scientists perceived QRPs as more appropriate and defensible when paired with a justifiable motive than with a clearly unethical one, particularly for QRPs whose ethicality is more ambiguous. In fact, ambiguous QRPs were perceived as categorically defensible when given a justifiable motive. This suggests that scientists are sensitive to contextual factors bearing on QRPs’ appropriateness, which could inform how institutions develop research integrity training modules.


Questionable research practices · Ethics · Integrity · Motives



The authors disclose that this research was funded by grants awarded to the first and third author from the Department of Health and Human Services’ Office of Research Integrity (Grant Nos. 1 ORIIR170035-01-00 and 1 ORIIR160021-01-00).



Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. School of Psychology, The University of Southern Mississippi, Hattiesburg, USA
  2. School of Humanities, The University of Southern Mississippi, Hattiesburg, USA
