Science and Engineering Ethics, Volume 21, Issue 6, pp 1509–1523

Variations in Scientific Data Production: What Can We Learn from #Overlyhonestmethods?

  • Louise Bezuidenhout
Original Paper


In recent months the hashtag #overlyhonestmethods has steadily been gaining popularity. Posts under this hashtag—presumably by scientists—detail aspects of daily scientific research that differ considerably from the idealized interpretation of scientific experimentation as standardized, objective and reproducible. Over and above its entertainment value, the popularity of this hashtag raises two important points for those who study both science and scientists. Firstly, the posts highlight that the generation of data through experimentation is often far less standardized than is commonly assumed. Secondly, the popularity of the hashtag, together with its relatively blasé reception by the scientific community, reveals that the actions reported in the tweets are far from shocking and indeed may be considered just “part of scientific research”. Such observations give considerable pause for thought, and suggest that current conceptions of data might be limited by a failure to recognize this “inherent variability” within the actions of generation—and thus within data themselves. Is it possible, we must ask, that epistemic virtues such as standardization, consistency, reportability and reproducibility need to be reevaluated? Such considerations are, of course, of particular importance to data sharing discussions and the Open Data movement. This paper suggests that the notion of a “moral professionalism” for data generation and sharing needs to be considered in more detail if the inherent variability of data is to be addressed in any meaningful manner.


Keywords: #Overlyhonestmethods · Research methods · Tacit knowledge · Moral professionalism · Data sharing · Scientific data · Open data



Many thanks to Prof Brian Rappert, Dr Ann-Sophie Barwich and Ms Helena van der Vegt for their invaluable comments on this manuscript and subject. I also thank the two anonymous reviewers for their helpful contributions to the manuscript.



Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. Department of Sociology, Philosophy and Anthropology, University of Exeter, Exeter, UK
  2. Steve Biko Centre for Bioethics, University of Witwatersrand, Johannesburg, South Africa
