
Questioning the Shanghai Ranking methodology as a tool for the evaluation of universities: an integrative review

Abstract

This integrative review reports on methodological questions about the Shanghai Ranking as a tool for the evaluation of universities, questions that extend to other rankings as well. The paper presents a list of methodological problems drawn from both a review of the literature and the authors’ own expertise, with the aim of improving and refining the ranking in line with the Berlin Principles. The second section makes proposals and provides explanatory notes for improving the evaluation of university institutions. A final inference is that any educational changes undertaken on the basis of an institution’s ranking position must be considered highly controversial and questionable.


Notes

  1.

    China’s concern to internationalize its research and to obtain recognition through, for example, the winning of Nobel prizes has reached the point of obsession. Cao (2004, 2014) describes the Nobel Prize complex, or “Nobelmania”, that existed as long as no Chinese-born scientist, holding Chinese nationality at the time of the award and working in a Chinese institution, had won one, until in 2015 the scientist Tu Youyou received the Nobel Prize in Physiology or Medicine for her contribution to the treatment of malaria.

    Even so, as Huang (2015) illustrates, China remains receptive to Western influence and to external international ranking systems and organizations, and it has made impressive progress in selecting elite universities.

  2.

    Notwithstanding, Indian researchers (Basu et al. 2016) propose applying a multidimensional “Quality–Quantity Composite Index” to rank India’s central universities, and there is a plethora of national ranking systems that seek idiographic contextualization. See Cakur et al. (2015) for a systematic comparison of national and global university ranking systems.

  3.

    The other two rankings (the Quacquarelli Symonds (QS) World University Ranking and the THE-Times Higher Education World University Ranking) could be criticized, however, for the excessive weight, more than 60%, that they give to institutional reputation as measured by surveys. This makes them even more questionable.

  4.

    López-Martín et al. (2018) reveal a validity problem in the Spanish U-Ranking (Fundación BBVA-IVIE 2017), which could be associated with a systematic error: performance criteria are predicted from features that are not relevant to them.

References

  1. Aguillo, I. F., Bar-Ilan, J., Levene, M., & Ortega, J. L. (2010). Comparing university rankings. Scientometrics, 85(1), 243–256.

  2. ARWU. (2003). Academic Ranking of World Universities. Shanghai: Jiao Tong University-Center for World-Class Universities. http://www.shanghairanking.com/ARWU2003.html.

  3. Bagozzi, R. P. (2007). The legacy of the technology acceptance model and a proposal for a paradigm shift. Journal of the Association for Information Systems, 8(4), 244–254.

  4. Barron, G. R. S. (2017). The Berlin Principles on ranking higher education institutions: Limitations, legitimacy, and value conflict. Higher Education, 73(2), 317–333.

  5. Basu, A., Banshal, S. K., Singhal, K., & Singh, V. K. (2016). Designing a composite index for research performance evaluation at the national or regional level: Ranking central universities in India. Scientometrics, 107(3), 1171–1193.

  6. Billaut, J. C., Bouyssou, D., & Vincke, P. (2010). Should you believe in the Shanghai Ranking? Scientometrics, 84(1), 237–263.

  7. Bouchard, J. (2017). Academic media ranking and the configurations of values in higher education: A sociotechnical history of a co-production in France between the media, state and higher education (1976–1989). Higher Education, 73(6), 947–962.

  8. Bougnol, M.-L., & Dulá, J. H. (2015). Technical pitfalls in university rankings. Higher Education, 69(5), 859–866.

  9. Bowman, N. A., & Bastedo, M. N. (2011). An anchoring effect on assessments of institutional reputation. Higher Education, 61(4), 431–444.

  10. Cakur, M. P., Acarturk, C., Alasehir, O., & Cilingir, C. (2015). A comparative analysis of global and national university ranking systems. Scientometrics, 103(3), 813–848.

  11. Cao, C. (2004). Chinese science and the ‘Nobel Prize complex’. Minerva, 42(2), 151–172.

  12. Cao, C. (2014). The universal values of science and China’s Nobel Prize pursuit. Minerva, 52(2), 141–160.

  13. CEPES-Institute for Higher Education Policy. (2006). Berlin Principles on ranking of higher education institutions. Retrieved 21 Nov 2017. https://www.che.de/downloads/Berlin_Principles_IREG_534.pdf.

  14. Chanowitz, B., & Langer, E. J. (1981). Premature cognitive commitment. Journal of Personality and Social Psychology, 41(6), 1051–1063.

  15. Comins, J. A. (2015). Data-mining the technological importance of government-funded patents in the private sector. Scientometrics, 104(2), 425–435.

  16. CWTS-Centre for Science and Technology Studies. (2017). CWTS Leiden ranking. Retrieved 12 Dec 2017. www.leidenranking.com/ranking/2017/list.

  17. Daraio, C., Bonaccorsi, A., & Simar, L. (2015). Rankings and university performance: A conditional multidimensional approach. European Journal of Operational Research, 244(3), 918–930.

  18. Davidov, E. (2009). Measurement equivalence of nationalism and constructive patriotism in the ISSP: 34 countries in a comparative perspective. Political Analysis, 17(1), 64–82.

  19. Dehon, C., McCathie, A., & Verardi, V. (2010). Uncovering excellence in academic rankings: A closer look at the Shanghai Ranking. Scientometrics, 83(2), 515–524.

  20. Ding, J., & Qiu, J. (2011). An approach to improve the indicator weights of scientific and technological competitiveness evaluation of Chinese universities. Scientometrics, 86(2), 285–297.

  21. Docampo, D. (2011). On using the Shanghai Ranking to assess the research performance of university systems. Scientometrics, 86(1), 77–92.

  22. Docampo, D. (2013). Reproducibility of the Shanghai academic ranking of world universities results. Scientometrics, 94(2), 567–587.

  23. Docampo, D., & Cram, L. (2014). On the internal dynamics of the Shanghai Ranking. Scientometrics, 98(2), 1347–1366.

  24. Docampo, D., & Cram, L. (2017). Academic performance and institutional resources: a cross-country analysis of research universities. Scientometrics, 110(2), 739–764.

  25. Docampo, D., Egret, D., & Cram, L. (2015). The effect of university mergers on the Shanghai Ranking. Scientometrics, 104(1), 175–191.

  26. Elken, M., Hovdhaugen, E., & Stensaker, B. (2016). Global rankings in the Nordic region: Challenging the identity of research-intensive universities? Higher Education, 72(6), 781–795.

  27. Escudero, T. (2017). La fiebre con los rankings. Un riesgo para la calidad global de las instituciones universitarias [Ranking fever. A risk to the overall quality of the universities]. El País Digital. Retrieved 26 June 2017 from http://elpais.com/elpais/2017/06/23/opinion/1498226306_209367.html.

  28. Fernández-Cano, A. (1995). Métodos para evaluar la investigación en Psicopedagogía [Methods for evaluating psychopedagogical research]. Madrid: Síntesis.

  29. Ferreira, C., & Vidal, J. (2017). El impacto de los rankings sobre la actividad de las universidades [The impact of rankings on universities activity]. In AIDIPE (Ed.), Actas XVIII Congreso Internacional de Investigación Educativa [Proceedings of the 18th international congress on educational research. Interdisciplinarity and transfer] (pp. 691–698). Salamanca: AIDIPE.

  30. Flórez, J. M., López, M. V., & López, A. M. (2014). El gobierno corporativo de las Universidades: Estudio de las 100 primeras Universidades del ranking de Shanghái [Corporate governance: Analysis of the top 100 universities in the Shanghai Ranking]. Revista de Educación, 364, 170–196.

  31. Florian, R. (2007). Irreproducibility of the results of the Shanghai academic ranking of world universities. Scientometrics, 72(1), 25–32.

  32. Freire, P. (1998a). The adult literacy process as cultural action for freedom. Harvard Educational Review, 68(4), 480–498. (Reprinted from Harvard Educational Review, 40, 1970).

  33. Freire, P. (1998b). Cultural action and conscientization. Harvard Educational Review, 68(4), 499–521. (Reprinted from Harvard Educational Review, 40, 1970).

  34. Freyer, L. (2014). Robust rankings: Review of multivariate assessments illustrated by the Shanghai Rankings. Scientometrics, 100(2), 391–406.

  35. Fundación BBVA-IVIE. (2017). U-ranking de las universidades españolas [U-ranking of Spanish universities]. Retrieved 23 Nov 2017. http://www.u-ranking.es/analisis.php.

  36. Grimes, D. A. (1993). Technology follies. The uncritical acceptance of medical innovation. JAMA, 269(23), 3030–3033.

  37. Gupta, B. M., & Karisiddappa, C. R. (2000). Modelling the growth of literature in the area of theoretical population genetics. Scientometrics, 49(2), 321–355.

  38. Harvard Graduate School of Education. (2017). Tuition and costs. Retrieved 19 Dec 2017. https://www.gse.harvard.edu/financialaid/tuition.

  39. Huang, F. (2015). Building the world-class research universities: A case study of China. Higher Education, 70(2), 203–215.

  40. Jabnoun, N. (2015). The influence of wealth, transparency, and democracy on the number of top ranked universities. Quality Assurance in Education, 23(2), 108–122.

  41. Jeremic, V., Bulajic, M., Martic, M., & Radojicic, Z. (2011). A fresh approach to evaluating the academic ranking of world universities. Scientometrics, 87(3), 587–596.

  42. Jovanovic, M., Jeremic, V., Savic, G., Bulajic, M., & Martic, M. (2012). How does the normalization of data affect the ARWU ranking? Scientometrics, 93(2), 319–327.

  43. Lane, S. (2014). Validity evidence based on testing consequences. Psicothema, 26(1), 127–137.

  44. López-Martín, E., Moreno-Pulido, A., & Expósito-Casas, E. (2018). Validez predictiva del u-ranking en las titulaciones universitarias de ciencias de la salud [The U-ranking’s predictive validity in university health studies]. Bordón. Revista de Pedagogía, 70(1), 57–72.

  45. Macri, J., & Sinha, D. (2006). Rankings methodology for international comparisons of institutions and individuals: An application to economics in Australia and New Zealand. Journal of Economic Surveys, 20(1), 111–156.

  46. Marginson, S. (2014). University rankings and social science. European Journal of Education, 49(1), 45–59.

  47. Martínez-Rizo, F. (2011). Los rankings de universidades: Una visión crítica. [University rankings: A critical view]. Revista de la Educación Superior, 40(157), 2–21.

  48. Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63.

  49. Moed, H. F. (2017). A critical comparative analysis of five world university rankings. Scientometrics, 110(2), 967–990.

  50. Moksony, F., Hegedus, R., & Csaszar, M. (2014). Rankings, research styles, and publication cultures: A study of American sociology departments. Scientometrics, 101(3), 1715–1729.

  51. O’Connell, C. (2013). Research discourses surrounding global university rankings: Exploring the relationship with policy and practice recommendations. Higher Education, 65(6), 709–723.

  52. Piro, F. N., & Sivertsen, G. (2016). How can differences in international university rankings be explained? Scientometrics, 109(3), 2263–2278.

  53. Quacquarelli Symonds–QS. (2017). QS World University Rankings 2016-2017. Retrieved 14 Dec 2017. https://www.topuniversities.com/university-rankings/world-university-rankings/2016.

  54. Sadlak, J., & Liu, N.-C. (Eds.). (2009). The world-class university and ranking: Aiming beyond status. Bucharest: UNESCO-CEPES.

  55. Safon, V. (2013). What do global university rankings really measure? The search for the X factor and the X entity. Scientometrics, 97(2), 223–244.

  56. Scriven, M. (2009). Meta-evaluation revisited. Journal of MultiDisciplinary Evaluation, 6(11), iii–viii.

  57. Shanghai Ranking Consultancy. (2017). Academic Ranking of World Universities 2017. Retrieved 17 Oct 2017. http://www.shanghairanking.com.

  58. Shehatta, I., & Mahmood, K. (2016). Correlation among top 100 universities in the major six global rankings: policy implications. Scientometrics, 109(2), 1231–1254.

  59. Skinner, B. F. (1956). A case history in scientific method. American Psychologist, 11(5), 221–223.

  60. Sorz, J., Wallner, B., Seidler, H., & Fieder, M. (2015). Inconsistent year-to-year fluctuations limit the conclusiveness of global higher education rankings for university management. PEERJ, 3, e1217.

  61. Stufflebeam, D. (2001). The metaevaluation imperative. American Journal of Evaluation, 22(2), 183–209.

  62. THE-Times Higher Education World University Ranking. (2017). Times Higher Education World University Rankings 2016–2017. Retrieved 8 Nov 2017. https://www.timeshighereducation.com/world-university-rankings/2017/world-ranking.

  63. Tijssen, R. J. W., Yegros-Yegros, A., & Winnink, J. J. (2016). University-industry R&D linkage metrics: validity and applicability in world university rankings. Scientometrics, 109(2), 677–696.

  64. Tofallis, C. (2012). A different approach to university rankings. Higher Education, 63(1), 1–18.

  65. Universidad de Granada. (2017). Másteres oficiales de la UGR [Official master’s degrees at the UGR]. Retrieved 18 Dec 2017. https://masteres.ugr.es/pages/masteres.

  66. URAP-Informatics Institute of Middle East Technical University. (2017). University Ranking by Academic Performance. Retrieved 4 Jan 2018. http://www.urapcenter.org/2017.

  67. van Raan, A. F. C. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.

  68. Vázquez, G., Murillo, F., Cabezas, J., Gómez, J., Martín, C., Chaves, J., et al. (2008). El examen MIR, su cambio como una opción estratégica [The MIR examination, the change as a strategic option]. Educación Médica, 11(4), 203–206.

  69. Virk, H. S. (2016). Shanghai Rankings 2016: Poor performance of Indian universities. Current Science, 111(4), 601.

  70. Williams, R., & de Rassenfosse, G. (2016). Pitfalls in aggregating performance measures in higher education. Studies in Higher Education, 41(1), 51–62.

  71. Zeller, R. A. (1997). Validity. In J. P. Keeves (Ed.), Educational research, methodology and measurement: An international handbook (pp. 822–829). New York: Pergamon.


Author information

Correspondence to Antonio Fernández-Cano.

Additional information

The integrative review is a methodology that synthesizes knowledge from significant studies and considers the applicability of their results to practice. It rationally combines bibliographic research with the authors’ own reflection. Preparing such a review involves six stages: formulating the central question (here, explicitly questioning the Shanghai methodology), searching the relevant literature, collecting data, critically examining the included studies, discussing the results, and writing the report. For reasons of concision, this paper presents only the final stage.


About this article


Cite this article

Fernández-Cano, A., Curiel-Marin, E., Torralbo-Rodríguez, M. et al. Questioning the Shanghai Ranking methodology as a tool for the evaluation of universities: an integrative review. Scientometrics 116, 2069–2083 (2018). https://doi.org/10.1007/s11192-018-2814-7


Keywords

  • Shanghai Ranking
  • Integrative review
  • Evaluation methodology
  • Universities
  • Berlin Principles