Assessing the effects of the German Excellence Initiative with bibliometric methods

Abstract

The German Excellence Initiative started in 2006 as a public funding program of crucial importance for German universities. Since then, several studies on different aspects of the program have been conducted, but there have been no analyses using bibliometric methods to measure the direct effects of funding, apparently because publications resulting from the funding program are not publicly and comprehensively documented. This paper uses the concept of highly cited publications to measure excellent research and explores two methodological approaches to attributing highly cited publications to the Excellence Initiative. To this end, the paper focuses on publications produced by the clusters of excellence (CoEs). The CoEs constitute only one of three funding lines, but they receive 60 % of the total funding of the excellence program and form the core research units of the Excellence Initiative. The highly cited publications of the CoEs are identified via self-selected lists of publications in the CoE renewal proposals and via a funding acknowledgement analysis. The validity of both data sources is analyzed comparatively. Based on the objectives of the Excellence Initiative, its effects at the level of funded clusters, universities and the overall German research system are explored. The bibliometric analysis provides evidence that the funding program has succeeded in concentrating excellent research and fostering collaborations between universities and the non-university research sector, but has not caused massive changes to the overall German research system.

Notes

  1. http://www.shanghairanking.com/; https://www.timeshighereducation.com/; http://www.qs.com/.

  2. We use the common German abbreviations of the official names of the non-university research organizations: Max Planck Gesellschaft (MPG), Helmholtz-Gemeinschaft Deutscher Forschungszentren (HGF), Wissenschaftsgemeinschaft Gottfried Wilhelm Leibniz (WGL) and Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung (FHG).

  3. More broadly, and without a special focus on the Excellence Initiative, bibliometric evaluations of coordinated grant programs at the level of national science systems have been carried out in different countries; see, for example, Aagaard et al. (2015) and Moed (2008).

  4. Some of the findings of this paper were published in two reports, both in German (Hornbostel and Möller 2015; Möller 2016).

  5. These have been calculated on the basis of the Oracle percentile_cont function; a sketch of an equivalent computation is given after these notes.

  6. Our general worldwide baseline differs slightly between the years of investigation (9.93–10.16 %). Moreover, we do not apply the adjustment factor used in the initial CWTS procedure to neutralize possible biases in inter-field comparisons resulting from differing numbers of tied publications between subject categories. This factor has been criticized by Waltman and Schreiber (2013) as conceptually problematic. As we do not compare, for example, research groups with differing profiles but focus on higher aggregation levels, no problematic biases are to be expected.

  7. All calculations have been processed on the database of the German Competence Center for Bibliometrics, funded by the Federal Ministry of Education and Research (Grant Number 01PQ13001). The database contains the tagged data of the Science Citation Index Expanded, Scientific and Technical Proceedings, Social Sciences Citation Index, Arts and Humanities Citation Index, and Social Sciences and Humanities Proceedings, frozen in the 14th calendar week of 2014. Citations are analyzed within a three-year citation window.

  8. These were not considered to be influencing factors for these methodological questions and may, in the case of the later extraction of funding phrases, deliver a more complete picture.

  9. In contrast to the 30 proposals (one proposal was missing), the funding acknowledgement analysis included all 31 CoEs in engineering, natural and life sciences. In the case of the missing proposal, we proceeded by first reviewing that CoE's publications found via the general phrases and then extracting special phrases from these (see the phrase-matching sketch after these notes).

  10. False positives resulted to a large extent from a small number of general phrases for which it was not possible to differentiate between the German clusters of excellence and other excellence funding lines.

  11. As one cluster of excellence did not supply its proposal, we can only compare the results of 30 CoEs in this section.

  12. Our general worldwide baseline is almost 10 % (see footnote 6). If, however, all internationally co-authored publications are counted multiply (once per country) at the worldwide level, the factual baseline is greater than 10 %.

  13. We use the common German abbreviations for non-university research organizations (see footnote 2): Helmholtz Association (HGF), Max Planck Society (MPG), Leibniz Association (WGL) and Fraunhofer Society (FHG).

  14. Developed in the Competence Centre for Bibliometrics for the German Science System (Winterhager et al. 2014).

  15. The abbreviations in Fig. 3 are explained in the text and in Table 3 (“Appendix”). In addition, Table 3 gives the numbers of publications for each unit of investigation for the pre-funding and the funding period. See Möller (2016, Figures 20 and 22, pp. 31, 32) and Table 4 for an annual analysis of the units, but without an identification of the CoE publications. For almost all units of investigation, a slight increase in the PP (top 10 %) over the years is observable, except for the FHG, which conducts mainly applied research in conjunction with industry partners and has the lowest publication output of all units of investigation (see Table 3). Annual data [P, PP (top 10 %)] for the whole German science system and the university sector is provided in the “Appendix”.

  16. The Karlsruhe Institute of Technology (KIT) has not been considered in the UoE group, as it emerged from a merger of the University of Karlsruhe and the Karlsruhe Research Centre (HGF) as part of their institutional strategy. KIT is also not included in the HGF group.

  17. Collaboration indicators may be biased to a certain extent by the fact that directors or group leaders of non-university research organizations may also be appointed as professors by universities and use both addresses. In addition, job transitions may lead to several address affiliations for the same person. To estimate the extent of this, we identified all publications in the period 2009–2011 that feature two addresses, one of a university and one of a non-university research institute, as affiliations of a single author, with the additional constraint that these two addresses technically constitute a collaboration that is not additionally based on other authors of the paper (a sketch of this check is given after these notes). In the segment of highly cited funding acknowledgement publications, this amounts to around 21 % of all collaborative publications between universities and the non-university research sector (somewhat higher in other segments). We further researched the C.V. information of the authors in this corpus segment manually on the internet, which led to a further differentiation: in around one-third of these publications, an institutionalized collaboration without any mention of a CoE in the C.V. or on staff pages could be positively confirmed, and in a smaller share of less than one tenth of the publications, job transitions could be the reason for the double affiliations. For the remainder, either no indication of joint university/research-organization appointments in the relevant timeframe could be found, or no specific C.V. or other reliable information about the person was available; the latter might include cases of former Ph.D. students. We estimate that less than 10 % of the highly cited co-publications with funding acknowledgements between universities and non-university research organizations feature a formalized institutional collaboration, double affiliation or transition. This does not, of course, exclude the possibility that a real cooperation took place in addition. In general, these factors have to be assumed for both time periods (possibly with a slight increase in joint appointments).

  18. The rise of co-publications between universities and the Helmholtz Association (HGF) or the Leibniz Association (WGL), in which CoEs play a lesser role, can probably be attributed to the collaborative objectives of the Pact for Research and Innovation or to other funding instruments of the non-university research organizations. Two funding instruments of the Helmholtz Association (the Helmholtz Virtual Institutes and the Helmholtz Institutes) are dedicated to promoting cooperation with universities: so far, 110 Helmholtz Virtual Institutes have received 126 million euros in funding under the precondition of collaboration with universities, of which the universities benefit from approximately 66 million euros. The seven Helmholtz Institutes receive 3–5 million euros per year as institutional support (Helmholtz-Gemeinschaft 2015).

  19. In 2011, 74.2 % of German publications had a university address, while 25.6 % had an address from one of the non-university research organizations.

  20. Whole counting is used; therefore multiple assignments are possible.

  21. 67 % of MPG cluster publications derive from collaborations with universities, but the remaining 33 % are authored by MPG authors without additional university addresses (not shown in the figure).

  22. http://www.leidenranking.com/ranking.

  23. This is based on the following calculation: in the period from 2008 to 2010, a total of 1.14 billion euros was provided for the Excellence Initiative in the budget of the DFG (Gemeinsame Wissenschaftskonferenz 2014). The research and development expenditure of all institutions in the same period amounted to 35.585 billion euros (Destatis 2014). The figure of 3.2 % (CoE funding as a percentage of total R&D expenditure in higher education) is thus obtained as 1.14/35.585 ≈ 0.032, i.e. 3.2 %.
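The following is a minimal sketch, not the authors' implementation, of how the top-10 % thresholds referred to in notes 5–7 could be reproduced outside Oracle: a continuous (linearly interpolated) percentile per subject category and publication year, which corresponds to PERCENTILE_CONT(0.9). The column names and the tie-handling rule are assumptions made for illustration only.

```python
# Sketch only (assumed data model): flag publications whose citation count,
# taken within a three-year window, reaches the top-10 % threshold of their
# Web of Science subject category and publication year. pandas' quantile with
# linear interpolation mirrors Oracle's PERCENTILE_CONT(0.9).
import pandas as pd

def flag_top10(pubs: pd.DataFrame) -> pd.DataFrame:
    thresholds = (
        pubs.groupby(["subject_category", "pub_year"])["citations"]
            .transform(lambda c: c.quantile(0.9, interpolation="linear"))
    )
    out = pubs.copy()
    # Counting ties at the threshold as highly cited is an assumption of this
    # sketch; depending on ties, the realized share deviates slightly from
    # 10 % (cf. the 9.93-10.16 % worldwide baseline in note 6).
    out["top10"] = out["citations"] >= thresholds
    return out
```

For a single category and year, numpy.percentile(citations, 90) with its default linear interpolation would return the same threshold value.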
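As a companion to notes 9 and 10, the next sketch illustrates the kind of phrase matching that can be used to find CoE publications in Web of Science funding acknowledgement texts. The phrase lists and the cluster label below are invented placeholders, not the phrases actually used in the study.

```python
# Sketch only: match a funding acknowledgement string against general phrases
# (prone to false positives, cf. note 10) and cluster-specific phrases.
import re

GENERAL_PHRASES = [r"cluster of excellence", r"exzellenzcluster", r"excellence initiative"]
# Hypothetical cluster-specific phrases; for the CoE with the missing proposal,
# such phrases were extracted from publications found via the general phrases (note 9).
SPECIAL_PHRASES = {
    "CoE-A (hypothetical)": [r"exc\s*0000", r"centre for imaginary studies"],
}

def match_acknowledgement(text: str) -> dict:
    text_lc = text.lower()
    general_hits = [p for p in GENERAL_PHRASES if re.search(p, text_lc)]
    special_hits = {
        coe: hits
        for coe, phrases in SPECIAL_PHRASES.items()
        if (hits := [p for p in phrases if re.search(p, text_lc)])
    }
    return {"general": general_hits, "special": special_hits}
```

Publications matched only by a general phrase would still need manual checking against the actual funding line, which is where the false positives of note 10 arise.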
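Finally, a sketch of one possible reading of the single-author dual-affiliation check described in note 17, under an assumed data model in which every author of a paper is reduced to the set of sectors ('uni', 'non_uni') covered by his or her addresses.

```python
# Sketch only (assumed data model): does the university / non-university link
# of a paper rest solely on authors holding both kinds of address (note 17)?
def rests_on_dual_affiliation(sectors_per_author: list[set[str]]) -> bool:
    dual = [s for s in sectors_per_author if {"uni", "non_uni"} <= s]
    others = [s for s in sectors_per_author if not {"uni", "non_uni"} <= s]
    # The collaboration is "not additionally based on other authors" if the
    # remaining authors do not span both sectors on their own.
    others_span_both = any("uni" in s for s in others) and any("non_uni" in s for s in others)
    return bool(dual) and not others_span_both

# One dual-affiliated author plus university-only co-authors: flagged.
assert rests_on_dual_affiliation([{"uni", "non_uni"}, {"uni"}]) is True
# The sectors are also bridged by two single-affiliation co-authors: not flagged.
assert rests_on_dual_affiliation([{"uni", "non_uni"}, {"uni"}, {"non_uni"}]) is False
```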

References

  • Aagaard, K., Bloch, C., & Schneider, J. W. (2015). Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator. Research Evaluation, 24(2), 106–117. doi:10.1093/reseval/rvv003.

  • Aksnes, D. W., & Rip, A. (2009). Researchers’ perceptions of citations. Research Policy, 38(6), 895–905.

  • Aksnes, D. W., Schneider, J. W., & Gunnarsson, M. (2012). Ranking national research systems by citation indicators. A comparative analysis using whole and fractionalised counting methods. Journal of Informetrics, 6(1), 36–43. doi:10.1016/j.joi.2011.08.002.

  • Beck, U. (1986). Risikogesellschaft. Auf dem Weg in eine andere Moderne. Frankfurt a. M.: Suhrkamp.

  • Belter, C. W. (2013). A bibliometric analysis of NOAA’s Office of Ocean Exploration and Research. Scientometrics, 95(2), 629–644. doi:10.1007/s11192-012-0836-0.

  • Bloch, R., Keller, A., Lottmann, A., & Würmann, C. (2008). Making Excellence: Grundlagen, Praxis und Konsequenzen der Exzellenzinitiative. Bielefeld: Bertelsmann.

  • Bornmann, L. (2014). How are excellent (highly cited) papers defined in bibliometrics? A quantitative analysis of the literature. Research Evaluation, 23(2), 166–173. doi:10.1093/reseval/rvu002.

  • Bornmann, L. (2016). Is the promotion of research reflected in bibliometric data? A network analysis of highly cited papers on the Clusters of Excellence supported under the Excellence Initiative in Germany. Scientometrics, 107(3), 1041–1061. doi:10.1007/s11192-016-1925-2.

  • Bornmann, L., & Leydesdorff, L. (2013). The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000. Journal of Informetrics, 7(2), 286–291. doi:10.1016/j.joi.2012.12.003.

  • Bukow, S., & Möller, T. (2013). Die Rekrutierung wissenschaftlichen Spitzenpersonals in der Exzellenzinitiative. Berlin: iFQ.

  • Destatis (2014). Finanzen und Steuern 2012 (Fachserie 14 Reihe 3.6). Wiesbaden: Statistisches Bundesamt.

  • Deutsche Forschungsgemeinschaft (2015). Förderatlas 2015: Kennzahlen zur öffentlich finanzierten Forschung in Deutschland. Bonn: Deutsche Forschungsgemeinschaft.

  • Deutsche Forschungsgemeinschaft DFG (2011). General information on preparing a proposal and template for initial/renewal proposals for a Cluster of Excellence, Second Programme Phase 2012–2017, ExIn304e. Retrieved from http://www.dfg.de/formulare/exin304e/exin304e_rtf.rtf (2014-2-24).

  • Díaz-Faes, A. A., & Bordons, M. (2014). Acknowledgments in scientific publications: Presence in Spanish science and text patterns across disciplines: Acknowledgments in Scientific Publications. Journal of the Association for Information Science and Technology, 65(9), 1834–1849. doi:10.1002/asi.23081.

  • Engels, A., Ruschenberg, T., & Zuber, S. (2012). Chancengleichheit in der Spitzenforschung: Institutionelle Erneuerung der Forschung in der Exzellenzinitiative des Bundes und der Länder. In T. Heinze & G. Krücken (Eds.), Institutionelle Erneuerungsfähigkeit der Forschung (pp. 187–217). Wiesbaden: VS-Verlag.

  • ExV, Exzellenzvereinbarung. (2005). Bund-Länder-Vereinbarung gemäß Artikel 91 b des Grundgesetzes (Forschungsförderung) über die Exzellenzinitiative des Bundes und der Länder zur Förderung von Wissenschaft und Forschung an deutschen Hochschulen. Retrieved from http://www.gwk-bonn.de/fileadmin/Papers/exzellenzvereinbarung.pdf.

  • ExV II. (2009). Exzellenzvereinbarung II: Verwaltungsvereinbarung zwischen Bund und Ländern gemäß Artikel 91b Abs. 1 Nr. 2 des Grundgesetzes über die Fortsetzung der Exzellenzinitiative des Bundes und der Länder zur Förderung von Wissenschaft und Forschung an deutschen Hochschulen. Retrieved from www.gwk-bonn.de/fileadmin/Papers/Exzellenzvereinbarung-II-2009.pdf.

  • Gemeinsame Wissenschaftskonferenz (2005). Pakt für Forschung und Innovation I. Retrieved from http://www.gwk-bonn.de/fileadmin/Papers/pakt_fuer_forschung_und_innovation.pdf.

  • Gemeinsame Wissenschaftskonferenz. (2014). Pakt für Forschung und Innovation III. Retrieved from www.gwk-bonn.de/fileadmin/Papers/PFI-III_2016-2020.pdf.

  • Hartmann, M. (2006). Die Exzellenzinitiative—ein Paradigmenwechsel in der deutschen Hochschulpolitik. Leviathan, 34(4), 447–465.

  • Helmholtz-Gemeinschaft. (2015). Website der Helmholtz-Gemeinschaft, Helmholtz-Zentren und Netzwerke. Retrieved from http://www.helmholtz.de/helmholtz_zentren_netzwerke.

  • Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. doi:10.1038/520429a.

  • Hornbostel, S., & Möller, T. (2015). Die Exzellenzinitiative und das deutsche Wissenschaftssystem: eine bibliometrische Wirkungsanalyse. Berlin: Berlin-Brandenburgische Akademie der Wissenschaften.

  • Hornbostel, S., Simon, D., & Heise, S. (2008). Exzellente Wissenschaft. Das Problem, der Diskurs, das Programm und die Folgen (Vol. 4). Bonn: iFQ.

  • Internationale Kommission. (1999). Forschungsförderung in Deutschland. Bericht der internationalen Kommission zur Systemevaluation der Deutschen Forschungsgemeinschaft und der Max-Planck-Gesellschaft. Hannover: Volkswagen-Stiftung.

  • Langfeldt, L., Benner, M., Sivertsen, G., Kristiansen, E. H., Aksnes, D. W., Borlaug, S. B., et al. (2015). Excellence and growth dynamics: A comparative study of the Matthew effect. Science and Public Policy, 42(5), 661–675. doi:10.1093/scipol/scu083.

  • Leibfried, S. (2010). Die Exzellenzinitiative. Zwischenbilanz und Perspektiven. Frankfurt a. M.: Campus.

  • Leydesdorff, L., Bornmann, L., Mutz, R., & Opthof, T. (2011). Turning the tables on citation analysis one more time: Principles for comparing sets of documents. Journal of the American Society for Information Science and Technology, 62(7), 1370–1381. doi:10.1002/asi.21534.

  • Markova, H. (2013). Exzellenz durch Wettbewerb und Autonomie? Deutungsmuster hochschulpolitischer Eliten am Beispiel der Exzellenzinitiative. Konstanz: UVK Verlagsgesellschaft.

  • Mittermaier, B. (2011). Publizieren Spitzen-Unis mehr? duz Magazin 09. Retrieved from http://www.duz.de/duz-magazin/2011/09/publizieren-spitzen-unis-mehr/27.

  • Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.

  • Moed, H. F. (2008). UK Research Assessment Exercises: Informed judgments on research quality or quantity? Scientometrics, 74(1), 153–161. doi:10.1007/s11192-008-0108-1.

  • Möller, T. (2016). Messung möglicher Auswirkungen der Exzellenzinitiative sowie des Pakts für Forschung und Innovation auf die geförderten Hochschulen und außeruniversitären Forschungseinrichtungen. Berlin: Expertenkommission Forschung und Innovation (EFI). Retrieved from http://www.e-fi.de/fileadmin/Innovationsstudien_2016/StuDIS_09_2016.pdf.

  • Möller, T., Antony, P., Hinze, S., & Hornbostel, S. (2012). Exzellenz begutachtet: Die Befragung der Gutachter in der Exzellenzinitiative (Vol. 11). Berlin: iFQ.

  • Morillo, F., Costas, R., & Bordons, M. (2015). How is credit given to networking centres in their publications? A case study of the Spanish CIBER research structures. Scientometrics, 103(3), 923–938. doi:10.1007/s11192-015-1564-z.

  • Münch, R. (2006). Wissenschaft im Schatten von Kartell, Monopol und Oligarchie. Die latenten Effekte der Exzellenzinitiative. Leviathan, 34(4), 466–486.

  • Münch, R. (2007). Die akademische Elite. Zur sozialen Konstruktion wissenschaftlicher Exzellenz. Frankfurt a. M.: Suhrkamp.

  • Pasternack, P. (2009). Jenseits der Exzellenzinitiative. Alternative Optionen für die ostdeutsche Hochschulentwicklung. Die Hochschule, 1, 142–154.

  • Rigby, J. (2013). Looking for the impact of peer review: Does count of funding acknowledgements really predict research impact? Scientometrics, 94(1), 57–73. doi:10.1007/s11192-012-0779-5.

  • Rigby, J., & Julian, K. (2014). On the horns of a dilemma: does more funding for research lead to more research or a waste of resources that calls for optimization of researcher portfolios? An analysis using funding acknowledgement data. Scientometrics, 101(2), 1067–1075. doi:10.1007/s11192-014-1259-x.

  • Scherb, J. (2012). Lissabon-Strategie (Lissabon-Prozess). Handlexikon der Europäischen Union. Baden-Baden: Nomos.

  • Simon, D. (1991, December 9). Die Universität ist verrottet. Der Spiegel, pp. 52–53.

  • Sirtes, D. (2013). Funding acknowledgements for the German Research Foundation (DFG). The dirty data of the Web of Science database and how to clean it up. In J. Gorraiz, E. Schiebel, C. Gumpenberger, M. Hörlesberger, & H. Moed (Eds.), Proceedings of the International Conference on Scientometrics and Informetrics. Vienna: 14th International Society of Scientometrics and Informetrics Conference (ISSI).

  • Sirtes, D., Riechert, M., Donner, P., Aman, V., & Möller, T. (2015). Funding Acknowledgements in der Web of Science-Datenbank (Studien zum deutschen Innovationssystem Nr. XX-2014). Berlin: Expertenkommission Forschung und Innovation (EFI).

  • Sondermann, M., Simon, D., Scholz, A., & Hornbostel, S. (2008). Die Exzellenzinitiative. Beobachtungen aus der Implementierungsphase (Bd. 5). Bonn: iFQ.

  • Teddlie, C., & Yu, F. (2008). Mixed methods sampling: A typology with examples. In V. L. Plano Clark & J. W. Creswell (Eds.), The Mixed Methods Reader. Los Angeles: Sage.

  • Testa, J. (2009). The globalization of Web of Science 2005–2010. Retrieved from http://wokinfo.com/media/pdf/globalwosessay.pdf.

  • Thomson Reuters. (2016). Funding Acknowledgements. Retrieved from http://wokinfo.com/products_tools/multidisciplinary/webofscience/fundingsearch/.

  • Tijssen, R. J. W., Visser, M. S., & van Leeuwen, T. N. (2002). Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference? Scientometrics, 54(3), 381–397.

  • Waltman, L., Calero-Medina, C., Kosten, J., Noyons, E. C. M., Tijssen, R. J. W., van Eck, N. J., et al. (2012). The Leiden ranking 2011/2012: Data collection, indicators, and interpretation. Journal of the American Society for Information Science and Technology, 63(12), 2419–2432. doi:10.1002/asi.22708.

  • Waltman, L., & Schreiber, M. (2013). On the calculation of percentile-based bibliometric indicators. Journal of the American Society for Information Science and Technology, 64(2), 372–379. doi:10.1002/asi.22775.

  • Waltman, L., & van Eck, N. J. (2015). Field-normalized citation impact indicators and the choice of an appropriate counting method. Journal of Informetrics, 9(4), 872–894. doi:10.1016/j.joi.2015.08.001.

  • Wang, J., & Shapira, P. (2015). Is there a relationship between research sponsorship and publication impact? An analysis of funding acknowledgments in nanotechnology papers. PLoS One, 10(2), e0117727. doi:10.1371/journal.pone.0117727.

  • Winterhager, M., Schwechheimer, H., & Rimmert, C. (2014). Institutionenkodierung als Grundlage für bibliometrische Indikatoren. Bibliometrie—Praxis und Forschung, 3(14), 1–22.

  • Yegros-Yegros, A., & Costas, R. (2013). Analysis of the Web of Science funding acknowledgement information for the design of indicators on external funding attraction. In J. Gorraiz, E. Schiebel, C. Gumpenberger, M. Hörlesberger, & H. Moed (Eds.), Proceedings of the International Conference on Scientometrics and Informetrics (pp. 84–95). Vienna: 14th International Society of Scientometrics and Informetrics Conference (ISSI).

  • Zuber, S., & Hüther, O. (2013). Interdisziplinarität in der Exzellenzinitiative—auch eine Frage des Geschlechts? Beiträge zur Hochschulforschung, 4, 54–81.

Author information

Corresponding author

Correspondence to Torger Möller.

Appendix

In Table 3 we recap the abbreviations and explanations from Fig. 3 (also described in the text following Fig. 3). In addition, the numbers of publications for the pre-funding and the funding period are given for each unit of investigation. In Table 4, annual data [P, PP (top 10 %)] for Germany and the German university sector are reported.

Table 3 English full names corresponding to the abbreviations and publication counts
Table 4 Annual data for Germany and the German university sector

Cite this article

Möller, T., Schmidt, M. & Hornbostel, S. Assessing the effects of the German Excellence Initiative with bibliometric methods. Scientometrics 109, 2217–2239 (2016). https://doi.org/10.1007/s11192-016-2090-3
