Higher Education, Volume 75, Issue 2, pp 341–363

Persistent factors facilitating excellence in research environments

  • Evanthia Kalpazidou Schmidt
  • Ebbe Krogh Graversen


The study presented here identifies robust, time-invariant features that characterise dynamic and innovative research environments. It takes as its point of departure an empirical study conducted in 2002 that identified the common characteristics of 15 dynamic and innovative public research environments, and traces their development by revisiting those environments more than a decade later, thereby mapping them in the current research landscape. Using a model for studies of research environments constructed and applied in the Nordic countries, the study maps both the internal elements of the environments and the framework conditions that influence research performance, and identifies persistent factors in dynamic and innovative research environments. The findings add to our understanding of how to improve the overall ecology of knowledge production and how to create optimal conditions that support research environments in pursuing and ensuring excellence. Implications for further research and policy are discussed.


Keywords: Research environments · Persistent factors · Knowledge production · Excellence



This work was supported by Aarhus University’s Senior Management Strategic Fund.

Compliance with ethical standards

Disclosure statement

The authors report no potential conflict of interest.

Supplementary material

ESM 1: 10734_2017_142_MOESM1_ESM.docx (DOCX, 21 kb)



Copyright information

© Springer Science+Business Media Dordrecht 2017

Authors and Affiliations

  • Evanthia Kalpazidou Schmidt (1) (corresponding author)
  • Ebbe Krogh Graversen (1)
  1. Danish Centre for Studies in Research and Research Policy, Department of Political Science, Aarhus University, Aarhus C, Denmark
