The measurement of academic performance is a relevant issue at the intersection of political science and informetrics. Numerous international comparisons (rankings) of Higher Education Institutions (HEIs) are regularly published—such as Shanghai, Times Higher Education and the Leiden Ranking; nonetheless, HEI performance analysis remains a challenging task. Higher education systems are complex, characterized by multiple levels (course, institution, nation, etc.), multiple objectives (i.e. teaching, research and third-mission activities) and heterogeneity.
Heterogeneity is one of the main critical issues to address in any benchmarking analysis. The comparability of the units of analysis is a necessary condition for any meaningful relative assessment or quantitative evaluation. Scholars and policy-makers have long devoted attention to the topic; nevertheless, the diversity of higher education systems remains difficult to address: a general conceptualization is still lacking (Huisman et al. 2015) and empirical analyses in the related literature appear to lead to contradictory outcomes (Barbato and Turri 2019).
The choice of the most salient dimensions of heterogeneity is still controversial. Multiple sources are associated with heterogeneity, including the national context, the HEI's mission, the presence or absence of medical schools, the institution's legal status, and its disciplinary orientation and degree of specialization (López-Illescas et al. 2011; Daraio et al. 2011). The dimension of internationalization has also been considered in recent studies, as nations become increasingly interdependent and internationalization missions are now embodied in universities' strategies (Huisman et al. 2015). Differences in performance outcomes could also originate from the different levels of autonomy and/or competitiveness experienced by universities (Aghion et al. 2010) and from the economic development of their contexts (with more influence on research-related activities than on teaching; Agasisti and Bertoletti 2019).
One-dimensional approaches to HEI performance evaluation carry the risk of unbalanced or even invalid conclusions, as they force a homogeneous vision of success and failure, missions and characteristics. The literature is moving towards more complex methodological approaches, progressively incorporating a more multi-dimensional perspective, including investigations of how, and to what extent, elements of heterogeneity influence performance. It should be taken into account that, due to this heterogeneity and multi-dimensionality, a truly valid overall "classification" of HEIs is difficult to obtain.
Bonaccorsi and Daraio (2009) represents one of the first attempts at analysing extensive data from different European countries with the aim of tackling their heterogeneity. Using a database from the AQUAMETH project,Footnote 1 they identified through cluster analysis different performance profiles across European countries, relating them to the different strategic orientations adopted by individual institutions (research-oriented, teaching-oriented, multi-purpose). Similar results were obtained by García-Aracil and Palomares-Montero (2012) and de la Torre et al. (2018), both with respect to the Spanish higher education system. The former, applying a cluster analysis, identified three groups: research-oriented universities, teaching-oriented universities and Knowledge Transfer (KT)-oriented universities. The latter, applying a so-called DEA-MDS multidimensional analysis, identified six groups: universities oriented towards efficiency in the traditional missions (particularly teaching), universities oriented towards efficiency in research, universities oriented towards efficiency in the traditional missions, universities oriented towards overall efficiency, universities oriented towards efficiency in KT, and regional universities oriented towards efficiency in research and KT. The results obtained in the present work confirm the same line of categorization, working with an extended database and more dimensions.
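The profile-identification step described above can be sketched in miniature. The following is an illustrative sketch only: a minimal k-means grouping of hypothetical institutions by normalized teaching, research and knowledge-transfer (KT) intensities. All figures, the initial centroids and the three-profile structure are invented for illustration; the cited studies work with far richer indicator sets and formal cluster-validation procedures.

```python
# Minimal k-means sketch on hypothetical HEI data (all values invented).
# Each point is a (teaching, research, KT) intensity triple in [0, 1].

def euclid(a, b):
    """Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def kmeans(points, centroids, iters=20):
    """Plain k-means with fixed initial centroids (deterministic)."""
    groups = {}
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid
        groups = {i: [] for i in range(len(centroids))}
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: euclid(p, centroids[i]))
            groups[nearest].append(p)
        # update step: recompute each centroid as its group's mean
        centroids = [
            [sum(c) / len(g) for c in zip(*g)] if g else centroids[i]
            for i, g in groups.items()
        ]
    return centroids, groups

# hypothetical (teaching, research, KT) intensities for six institutions
heis = [
    (0.9, 0.2, 0.1), (0.8, 0.3, 0.2),   # teaching-leaning
    (0.3, 0.9, 0.3), (0.2, 0.8, 0.4),   # research-leaning
    (0.2, 0.3, 0.9), (0.3, 0.2, 0.8),   # KT-leaning
]
centroids, groups = kmeans(heis, centroids=[heis[0], heis[2], heis[4]])
for i, members in groups.items():
    print(f"profile {i}: {len(members)} institutions, centroid {centroids[i]}")
```

With these toy data the three invented orientations are recovered as three clusters of two institutions each; real analyses would also need to choose the number of clusters and validate the partition.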
Daraio et al. (2011) investigate the identification of heterogeneity, distinguishing horizontal heterogeneity (i.e. decisions on subject mix, target audience, teaching methodologies, type of research, type of third-mission activities, etc.) from vertical heterogeneity (i.e. the positioning of a university within a hierarchy of quality of university service provision).
Catalano et al. (2017), focusing on the sources of heterogeneity induced by the subject mix of HEIs, and using the ETER database, propose to estimate "scale parameters" representing European students in different fields of education (namely Engineering, Medicine, Natural and Physical Sciences, Social Sciences and Humanities), as a tool to compare educational production across different fields on a common ground.
Similarly, Zharova et al. (2017), using micro-level data on publications and citations (Scopus) for selected HEIs in Germany, identify differences across research fields in (i) the relationships between funding volume, research productivity and number of citations; (ii) the influence of past research results on the likelihood of obtaining external funding; and (iii) the optimal response to exogenous changes. The evaluation of research performance by disaggregating disciplinary fields to low levels is also proposed by Bonaccorsi and Secondi (2017), who show how research performance depends on variables at the university level (e.g. size, teaching, governance) and at the level of the external regional environment (general effects—level of development of the region, expenditure in R&D and technological intensity of the manufacturing sector; specific effects—variables used to size the health sector).
Finally, Barbato and Turri (2019) compare two European countries, namely England and Italy, by considering different dimensions (core functions, subject mix, market size, structural information). Institutional positioning has been defined by Fumasoli and Huisman (2013) as the process through which HEIs locate themselves in specific niches within the HE system, reflecting the activities, resources (e.g. financial, human) and potential relations (competition, cooperation) that they assume in order to prosper in their system. Barbato and Turri (2019) identify two main approaches to positioning: more or less passive adaptation in the direction indicated by external contextual forces, and deliberate or emergent strategy. Institutional pressure (government regulation) and competition (for students, researchers, funds, reputation) are the two main external forces impacting HEIs. The results of the analysis indicate a more differentiated system in England, while in general both Italian and English HEIs are becoming increasingly homogeneous in terms of research intensity and increasingly heterogeneous in terms of internationalisation.
In this context, it is important for the research community and for policy-makers to understand how heterogeneity can be tackled and how it can influence performance. The present work adopts a multi-level perspective by combining national (macro) level data and institution (micro) level data and analyses, also showing the potential of using micro-level data to characterize national-level performance. We adopt a systemic perspective, integrating heterogeneous sources of available data, covering all three dimensions of the HEI production process (namely teaching, research and third mission) and including information on the national regulation measures introduced over time.
The current paper’s objective is to characterize HEIs while accounting for the following aspects:
Structural heterogeneity (structure of the national system: systemic factors, e.g. number and types of HEIs that are involved, governance factors);
Internal heterogeneity (linked to the type of production process carried out within HEIs);
Other heterogeneity sources.
The analysis focuses on the European context. European HEIs have been shown to perform worse than their US counterparts (Aghion et al. 2010), making it crucial to create tools for improvement. The US higher education system is characterized by significantly higher resources and a clear distinction between education-oriented institutions and doctoral universities, associated with an overall higher volume of publications and citations relative to revenues (Lepori et al. 2019).
Highly varying regulatory settings, traditions and economic development contexts substantially influence the level of heterogeneity between and within countries (Bonaccorsi 2014). The modernisation agenda for Higher Education in Europe (European Commission 2016) highlights the relevance of creating effective governance and funding mechanisms for higher education. Different models of governance (Agasisti and Catalano 2006; Capano et al. 2015) are applied by policy-makers trying to improve the systemic performance of higher education, resulting at the European level in designs that represent each country's own interpretation of a common template. After 30 years of adaptations, three systemic governance factors seem to have emerged (Capano and Pritoni 2019): a performance-based mode, a re-regulated mode and a systemic goal-oriented mode.
Finally, reliable data have recently become available, thanks to important advancements in data collection and processing procedures and to the activation of specific research projects aimed at creating broad databases with good coverage of different countries and years (e.g. AQUAMETH—see Daraio et al. 2011; EUMIDA—see Bonaccorsi 2014; ETERFootnote 2).
This work presents results from a larger project (see Acknowledgements) aimed at studying the activities, performance and efficiency of European HEIs. It focuses on a statistical exploration of a series of indicators linking education, in a systemic way, with research and the third mission. In terms of data analysis, it explores the combination of statistical data from ETER, the European Tertiary Education Register, with bibliometric data obtained from the Leiden Ranking,Footnote 3 with information on innovation activities from the PATSTAT database, and with categorizations of national higher education policies obtained from more qualitative studies of national HEI systems. Note that in our analysis a series of variables associated with patent activity and funding composition are used as proxies for third-mission activities.
The third mission refers to the economic and social impacts generated by HEIs' activities through interactions with external stakeholders, aiming to generate, apply and exploit knowledge (Secundo et al. 2017). Nevertheless, the concept still lacks a specific and unambiguous definition, mainly due to its dependence on contextual circumstances (Pinheiro et al. 2015). This lack of clarity contributes to a critical data-availability issue, along with the difficulties associated with output quantification and measurement, especially with regard to the societal dimensions. Given the available data, we decided to represent the third mission only through its innovation side, following well-established approaches in the literature.
In the project, the existing problems of data availability, quantification and comparability go hand in hand with the need to conceptualize the performance model before carrying out the analysis (Daraio and Bonaccorsi 2017). The notion of performance is characterized in a "progressive" way, starting from production ("volume" or extensive variables), moving to productivity (intensive or "size-independent" indicators of production), then to efficiency (combinations of outputs and inputs) and more elaborate efficiency models, and finally towards effectiveness and impact (Daraio 2019).
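The "progressive" characterization of performance can be made concrete with a toy computation. The sketch below uses invented figures for two hypothetical institutions (names, staff counts, budgets and publication counts are all assumptions); actual efficiency analyses in this literature rely on frontier methods such as DEA rather than the simple output/input ratios shown here.

```python
# Toy illustration of the progression production -> productivity -> efficiency.
# All institution names and figures are hypothetical.

heis = {
    # name: (academic staff, budget in millions of euros, publications)
    "HEI_A": (500, 80, 1200),
    "HEI_B": (200, 25, 600),
}

results = {}
for name, (staff, budget, pubs) in heis.items():
    results[name] = {
        "production": pubs,            # volume (extensive) indicator
        "productivity": pubs / staff,  # size-independent indicator
        "efficiency": pubs / budget,   # simple output/input combination
    }

for name, r in results.items():
    print(name, r)
```

Note how the ranking can flip between notions: the larger institution produces more in absolute terms, while the smaller one is more productive per staff member and per unit of budget, which is exactly why a progressive, multi-notion view of performance matters.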
The structure of this paper is as follows. "Method" Section gives an overview of the methods applied in the study, and "Data" Section a detailed description of the data sources. The results are presented in three parts. As an introduction to the analyses, "Basic information on national higher education systems" Section outlines the set of higher education institutions analysed in the paper and characterises national HE systems in terms of governance structures. "Results from the Cluster and efficiency analyses" Section presents the outcomes of a cluster analysis of higher education institutions based on their similarity in terms of bibliometric and governance characteristics, and focuses on the notion of efficiency of the institutions across clusters. Next, "Additional methodological approaches and case studies" Section illustrates two additional studies that represent lines of future research, aimed at further broadening the insights into the performance of higher education institutions and into the factors that influence this performance. The first relates to the methodology used to identify clusters, and the second to case studies providing a detailed comparison of particular countries. Finally, "Discussion and conclusions" Section summarises the results and makes further suggestions for future research.
The main objective of this work is to characterize the heterogeneity of HE systems (at the country and systemic levels), exploiting micro-level data and making use of a multi-methodological approach (combining qualitative exploratory analysis, cluster analysis and efficiency evaluation). Besides this multi-methodology, another novelty of the paper lies in the analysed sample, in terms of covered countries and data quality.
Some of the results of our analysis confirm the previous literature while relying on a more complete database with more recent and comparable data.