The Scientometric Bubble Considered Harmful


This article deals with a modern disease of academic science that consists of an enormous increase in the number of scientific publications without a corresponding advance of knowledge. Findings are sliced as thin as salami and submitted to different journals to produce more papers. If we consider academic papers as a kind of scientific ‘currency’ that is backed by gold bullion in the central bank of ‘true’ science, then we are witnessing an article-inflation phenomenon, a scientometric bubble that is most harmful for science and promotes an unethical and antiscientific culture among researchers. The main problem behind the scenes is that the impact factor is used as a proxy for quality. Therefore, not only for convenience, but also based on ethical principles of scientific research, we adhere to the San Francisco Declaration on Research Assessment when it emphasizes “the need to eliminate the use of journal-based metrics in funding, appointment and promotion considerations; and the need to assess research on its own merits rather than on the journal in which the research is published”. Our message is mainly addressed to the funding agencies and universities that award tenure or grants and manage research programmes, especially in developing countries. The message is also addressed to well-established scientists who have the power to change things when they participate in committees for grants and jobs.

References


  1. Adler, R., Ewing, J., & Taylor, P. (2009a). Citation statistics. A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). Statistical Science, 24(1), 1–14.

  2. Adler, R., Ewing, J., & Taylor, P. (2009b). Rejoinder: Citation statistics. Statistical Science, 24(1), 27–28.

  3. Barabási, A.-L., & Albert, R. (1999). Emergence of scaling in random networks. Science, 286(5439), 509–512.

  4. Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: Unintended consequences of journal rank. Frontiers in Human Neuroscience, 7(291), 1–12.

  5. Broad, W. J. (1981). The publishing game: Getting more for less. Science, 211(4487), 1137–1139.

  6. Cameron, W. B. (1963). Informal sociology: A casual introduction to sociological thinking. New York: Random House.

  7. DORA. (2012). San Francisco declaration on research assessment. Accessed 23 Oct 2014.

  8. Hall, P. G. (2009). Comment on citation statistics. Statistical Science, 24(1), 25–26.

  9. Larsen, P. O., & von Ins, M. (2010). The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics, 84(3), 575–603.

  10. Lawrence, P. A. (2003). The politics of publication. Nature, 422(6929), 259–261.

  11. Mackay, A. L. (1977). The harvest of a quiet eye: A selection of scientific quotations. Bristol: Institute of Physics.

  12. Mattern, F. (2008). Bibliometric evaluation of computer science: Problems and pitfalls. European Computer Science Summit (ECSS 2008). Accessed 23 Oct 2014.

  13. Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63.

  14. Moro, E. (2009). Publish and… perish. Accessed 23 Oct 2014.

  15. Nature Materials. (2005). Editorial: The cost of salami slicing. Nature Materials, 4, 1.

  16. Parnas, D. L. (2007). Stop the numbers game. Counting papers slows the rate of scientific progress. Communications of the ACM, 50(11), 19–21.

  17. Redner, S. (2005). Citation statistics from 110 years of Physical Review. Physics Today, 58(6), 49–54.

  18. Reinach, F. (2013). Darwin e a prática da ‘Salami Science’ [Darwin and the practice of ‘salami science’]. Accessed 23 Oct 2014.

  19. Silverman, B. W. (2009). Bibliometrics in the context of the UK research assessment exercise. Statistical Science, 24(1), 15–16.

  20. The Slow Science Academy. (2010). The Slow Science Manifesto. Accessed 23 Oct 2014.

  21. Thomson Reuters. (2013). Thomson Reuters statement regarding the San Francisco declaration on research assessment. Accessed 23 Oct 2014.

  22. Tipple, C. (1990). Reactions from a CEO. In C. T. Fitz-Gibbon (Ed.), Performance indicators (BERA Dialogues 2).

  23. Toffler, A. (1970). Future shock. New York: Random House.

  24. Waters, L. (2005). Enemies of promise: Publishing, perishing, and the eclipse of scholarship. Chicago: Prickly Paradigm Press. Second printing.


Partially funded by project PMI USA1204, Centro de Innovación en Tecnologías de la Información para Aplicaciones Sociales, Universidad de Santiago de Chile, Chile.

Author information



Corresponding author

Correspondence to Gonzalo Génova.

Cite this article

Génova, G., Astudillo, H. & Fraga, A. The Scientometric Bubble Considered Harmful. Sci Eng Ethics 22, 227–235 (2016).

Keywords

  • Ethics in scientific publications
  • Careers in academia
  • Research assessment
  • Scientometrics
  • Impact factor