On the effects of institutional size in university classifications: the case of the Shanghai ranking

Abstract

University rankings frequently struggle to delineate the separate contributions of institutional size and excellence. This presents a problem for public policy and university leadership, for example by conflating the pursuit of excellence with the quest for growth. This paper provides some insight into the size/excellence debate by exploring the explicit contribution of institutional size to the results of the Shanghai ranking indicators. Principal component analysis of data from the Shanghai ranking (2013 edition) is used to explore the factors that contribute to the variation of the total score. The analysis includes the five non-derived ARWU indicators (Alumni, Award, HiCi, S&N and PUB) and uses the number of full-time equivalent (FTE) academic staff as a measure of size. Two significant but unequal factors are found, together explaining almost 85 % of the variance in the sample. A factor clearly associated with the size of the institution explains around 30 % of the variance. To sharpen the interpretation of this smaller factor as a measure of the effect of size, we extend the analysis to a larger set of institutions so as to eliminate size-dependent selection effects. We also show that excluding outlying universities makes little difference to the factors. Our inferences are insensitive to whether we use raw data or the compressed and scaled indicators published by ARWU. We conclude that around 30 % of the variation in the ARWU indicators can be attributed to variation in size; size-related factors therefore cannot be overlooked when using the ranking results. Around 55 % of the variation arises from a component which is uncorrelated with size and which measures the quality of research conducted at the highest levels. The presence of this factor encourages further work to explore its nature and origins.
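
As an illustration of the kind of analysis the abstract describes, the sketch below runs a principal component analysis on the five non-derived ARWU indicators plus an FTE staff count. It is a minimal sketch, not the authors' code: the file name arwu_2013.csv, the column names, and the standardization step are assumptions, and the paper's exact preprocessing choices are not reproduced here.

```python
# Minimal sketch (hypothetical data layout): PCA of the five non-derived ARWU
# indicators plus FTE academic staff, in the spirit of the analysis described
# in the abstract.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical input: one row per university with 2013 ARWU indicator scores
# and an FTE academic staff count.
cols = ["Alumni", "Award", "HiCi", "SN", "PUB", "FTE"]
df = pd.read_csv("arwu_2013.csv")[cols].dropna()

# Standardize so each variable contributes on a comparable scale.
X = StandardScaler().fit_transform(df.values)

# Fit the PCA and report how much variance each component explains.
pca = PCA()
pca.fit(X)
for i, ratio in enumerate(pca.explained_variance_ratio_, start=1):
    print(f"PC{i}: {100 * ratio:.1f}% of variance")

# Loadings (variable weights on each component) suggest which component is
# size-related: one on which FTE and volume indicators load heavily.
loadings = pd.DataFrame(pca.components_.T, index=cols,
                        columns=[f"PC{i + 1}" for i in range(len(cols))])
print(loadings.round(2))
```

In this reading, a leading component explaining roughly half the variance with little weight on FTE would correspond to the excellence-related factor, while a second component dominated by FTE and volume-driven indicators would correspond to the size-related factor described above.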

Acknowledgments

The work of D. Docampo was supported by the European Regional Development Fund (ERDF) and the Galician Regional Government under agreement for funding the Atlantic Research Center for Information and Communication Technologies (AtlantTIC).

Author information

Corresponding author

Correspondence to Domingo Docampo.

Cite this article

Docampo, D., Cram, L. On the effects of institutional size in university classifications: the case of the Shanghai ranking. Scientometrics 102, 1325–1346 (2015). https://doi.org/10.1007/s11192-014-1488-z
