The German Academy of Sciences—Leopoldina—assessed the state of public health research in Germany (Leopoldina 2015). The widely welcomed report identified the need to strengthen structures, education and research in public health (Anonymous 2015). The range of weaknesses identified for Germany can be generalized to other countries, including neighbouring Switzerland and Austria. Strengthening the infrastructure and funding schemes for public health research is necessary wherever the public health sciences have not received the level of attention they enjoy elsewhere (e.g. the US or the UK). Given the importance of the Leopoldina report, IJPH will soon publish invited commentaries. This editorial, however, looks at the bibliometric study that Leopoldina commissioned as an input into the main assessment (Donner et al. 2014). The bibliometric study is misleading to a degree that warrants public debate.

The bibliometric analysis was used to assess the scientific output (2000–2012) of public health research institutions in Germany. Though public health research adopts a broad range of research methods, epidemiology is very often at the heart of the public health sciences. Accordingly, the bibliometric study covered both “public health” and “epidemiology”. Instead of measuring the scientific output of the German epidemiology and/or public health research community directly, the study defined a list of journals considered to reflect “public health” (N = 156) and/or “epidemiology” (N = 76) and counted the publications therein. However, the list failed to include the highest-ranking journals where public health scientists and epidemiologists place their most important work, such as the Lancet, NEJM, BMJ, JAMA, Nature Genetics, and the top-ranking disease-oriented medical journals where epidemiologic and/or public health research is published on a regular basis. Even journals from one of the All Science Journal Classification’s (ASJC) classic public health categories, “Public, Environmental and Occupational Health”, were partly omitted, thereby overlooking some of the highest-ranking journals. The consequence of this methodological decision is immediately evident in the report’s summary table, which also appears as an Annex in the full Leopoldina report (Leopoldina 2015), listing the “top ten” most productive German institutions. The highest-ranking institution published 154 articles during the 13-year study period, while the 10th-placed institution published 73.

Public health researchers immediately understand that these numbers are impossibly low, reflecting roughly the output of a single average researcher rather than an entire institution. However, scientists and decision makers from other fields may not realize that the report grossly failed to meet its objective. As in all sciences, methodology is a crucial determinant of failure or success. Embarrassed by these misleading numbers, I decided to evaluate the method in three ways.

First, as a public health researcher who has worked in environmental epidemiology for the last 25 years, I evaluated the journal list from an environmental health perspective—one of the classic pillars of public health science. Very early in my career, ground-breaking work in my field was published: the Harvard Six Cities Study reported the long-term effects of air pollution on mortality (Dockery et al. 1993). Neither this article from the Harvard School of Public Health—published in the NEJM and cited 3875 times (as of Sep 5, 2015)—nor the vast majority of publications from this seminal epidemiologic study would have been counted as “public health” or “epidemiology” in the German report.

Second, I looked at my last 50 publications and realized that only 14 % would have been considered “public health” or “epidemiology”. I then analysed my publications from 2000 to 2012, years fully dedicated to public health and very frequently to pure “epidemiology”. Only 19 % of those articles would qualify as “output”. Fewer than 20 % of the journals in which I have published were actually included in the journal list of the German assessment. At this point, I was grateful that the study did not include Swiss institutions.

Third, I selected a single scientist affiliated with the Helmholtz Centre—ranked 10th in the list of productive German institutions (73 articles). My search in Thomson Reuters (Aug 14, 2015), combining only the author’s name and “Helmholtz” as the “Address”, returned 190 publications (category “Articles” or “Reviews”) over the same 13 years. In total, only 3 % of her 190 articles—mostly public health oriented, if not high-profile epidemiology—appeared in the journals included in the German report, that is, roughly six papers. For an epidemiologist, a false-negative rate of 97 % is hard to swallow. This is sufficient evidence for me to conclude that the bibliometric study adopted a flawed method; any inference about the “output” of these German institutions is therefore meaningless.

The bibliometric study ignores how epidemiology and public health research are organized. Public health is a highly interdisciplinary field that draws on a broad range of scientific methods, including not only epidemiology but also the “soft” sciences (e.g. health communication, economics, policy, management) and the “hard” sciences (e.g. public health genomics, molecular epidemiology or exposure sciences). It is a challenge to properly define its borders, its bibliometric space, and its place within the disciplinary organization of academia. Instead of taking a narrow look under a single lamp post, bibliometry must face this challenge and look into the forest of bright lights. The report’s definition of what constitutes output in public health science and epidemiology is so restrictive that not much was left to evaluate.

In conclusion, I maintain the hypothesis that German public health research output is not substantially different from that of comparable countries that are also in need of better infrastructure and funding instruments in this highly relevant field of science (Leopoldina 2015). The bibliometric study calls for a defence of epidemiology and the public health sciences—not only in Germany, but globally. Bibliometric methods must comply with basic scientific standards, just like any other science. Otherwise, rather than letting flawed bibliometrics ridicule the output of public health research and epidemiology, one may need to dump bibliometry altogether.