Volume 107, Issue 3, pp 1171–1193

Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India

  • Aparna Basu
  • Sumit Kumar Banshal
  • Khushboo Singhal
  • Vivek Kumar Singh


It is now generally accepted that institutions of higher education and research, which are largely publicly funded, need to be subjected to some benchmarking or performance-evaluation process. Several international ranking exercises currently rank institutions at the global level, using a variety of performance criteria such as research publication data, citations, awards, and reputation surveys. In these exercises, the data are combined in specified ways to create an index that is then used to rank the institutions. The resulting lists are generally limited to the top 500–1000 institutions in the world. Further, some criteria used in these exercises (e.g., the Nobel Prize) are not relevant for the large number of institutions in the medium range. In this paper we propose a multidimensional ‘Quality–Quantity’ Composite Index for a group of institutions, built from bibliometric data, that can be used for ranking and for decision-making or policy purposes at the national or regional level. The index is applied here to rank Central Universities in India. The ranks obtained compare well with those obtained with the h-index, and partially with the size-dependent Leiden Ranking and University Ranking by Academic Performance. A generalized model for the index using other variables and variable weights is proposed.
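The construction described in the abstract (normalizing several bibliometric indicators per institution and combining them with chosen weights into a single score used for ranking) can be sketched as follows. The min–max normalization, the example indicators, and the equal weights are illustrative assumptions for this sketch, not the authors' exact scheme, which is defined in the full text.

```python
def composite_index(indicators, weights):
    """Rank institutions by a weighted composite of normalized indicators.

    indicators: dict mapping institution name -> list of raw indicator
    values (e.g., [publication count, citations per paper]).
    weights: list of weights, one per indicator, summing to 1.

    Each indicator column is min-max normalized to [0, 1] across the
    institutions, then combined as a weighted sum (higher is better).
    Returns a list of (institution, score) pairs, best first.
    """
    names = list(indicators)
    # Transpose: one tuple per indicator column.
    cols = list(zip(*(indicators[n] for n in names)))
    norm_cols = []
    for col in cols:
        lo, hi = min(col), max(col)
        # If all values are equal, the indicator cannot discriminate.
        norm_cols.append(
            [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in col]
        )
    scores = {
        name: sum(w * norm_cols[j][i] for j, w in enumerate(weights))
        for i, name in enumerate(names)
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


# Hypothetical data: publications (quantity) and citations per paper
# (quality) for three fictitious universities, equally weighted.
ranking = composite_index(
    {"U1": [100, 2.0], "U2": [50, 4.0], "U3": [10, 1.0]},
    [0.5, 0.5],
)
# On these numbers U2 ranks first: its quality score dominates despite
# its smaller output, which is exactly the trade-off a quality-quantity
# composite is meant to expose.
```

Swapping the weight vector (e.g., [0.8, 0.2]) shifts the ranking toward size-dependent output, which is how the generalized model with variable weights mentioned in the abstract would behave.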


National ranking · Research competitiveness · Research ranking · Scientometrics · University ranking · India



Acknowledgements

This work is supported by research grants from the Department of Science and Technology, Government of India (Grant: INT/MEXICO/P-13/2012) and the University Grants Commission of India [Grant: F. No. 41-624/2012(SR)].

Compliance with ethical standards

Conflict of interest



  1. Academic Ranking of World Universities. (2011). Ranking methodology. Available at: Accessed May 11, 2015.
  2. Aguillo, I. F., Ortega, J. L., & Fernández, M. (2008). Webometric ranking of world universities: Introduction, methodology, and future developments. Higher Education in Europe, 33(2–3), 233–244.
  3. Basu, A., & Aggarwal, R. (2001). International collaboration in science in India and its impact on institutional performance. Scientometrics, 52(3), 379–394.
  4. Basu, A., Aggarwal, R., Kumar, D., Kathuria, G., Jha, S., & Rani, T. (2000). Scientific productivity…where do we stand? Second brief report on bibliometric indicators of Indian science. NISTADS Report, January 2000.
  5. Basu, A., & Nagpaul, P. S. (1998). National mapping of science. NISTADS Report # REP-248/98.
  6. Billaut, J. C., Bouyssou, D., & Vincke, P. (2010). Should you believe in the Shanghai ranking? An MCDM view. Scientometrics, 84(1), 237–263.
  7. Bornmann, L. (2013). How to analyze percentile impact data meaningfully in bibliometrics? The statistical analysis of distributions, percentile rank classes, and top-cited papers. Journal of the Association for Information Science and Technology, 64(3), 587–595.
  8. Bornmann, L., Bowman, B. F., Bauer, J., Marx, W., Schier, H., & Palzenberger, M. (2014). Bibliometric standards for evaluating research institutes in the natural sciences. In B. Cronin & C. R. Sugimoto (Eds.), Beyond bibliometrics (pp. 201–224). MIT Press. ISBN: 978-0-262-52551-0.
  9. Bornmann, L., & Marx, W. (2014). How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations. Scientometrics, 98(1), 487–509.
  10. Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014b). What is the effect of country-specific characteristics on the research performance of scientific institutions? Using multi-level statistical models to rank and map universities and research-focused institutions worldwide. Journal of Informetrics, 8(3), 581–593.
  11. Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2015). Ranking and mapping of universities and research-focused institutions worldwide: The third release of excellencemapping.net. COLLNET Journal of Scientometrics and Information Management, 9(1), 65–72.
  12. Bradford, S. C. (1934). Sources of information on specific subjects. Engineering, 137, 85–86. Reprinted in 1985 in Journal of Information Science, 10(4), 173–180.
  13. Costas, R., & Bordons, M. (2007). The h-index: Advantages, limitations and its relation with other bibliometric indicators at the micro level. Special issue on h-index. Journal of Informetrics, 1(3), 193–203.
  14. García, J. A., Rodriguez-Sánchez, R., Fdez-Valdivia, J., Torres-Salinas, D., & Herrera, F. (2012). Ranking of research output of universities on the basis of the multidimensional prestige of influential fields: Spanish universities as a case of study. Scientometrics, 93(3), 1081–1099.
  15. Geraci, M., & Esposti, D. M. (2011). Where do Italian universities stand? An in-depth statistical analysis of national and international rankings. Scientometrics, 87(3), 667–681.
  16. Glänzel, W. (2001). National characteristics in international scientific co-authorship relations. Scientometrics, 51(1), 69–115.
  17. Gupta, B. M. (2011). Indian S&T during 15 years (1996–2010): A quantitative assessment using publication data. DESIDOC Journal of Library and Information Technology, 31(5), 359–370.
  18. Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. PNAS, 102(46), 16569–16572.
  19. Huang, M.-H. (2012). Exploring the h-index at the institutional level: A practical application in world university rankings. Online Information Review, 36(4), 534–547.
  20. Huang, M.-H., & Chi, P.-S. (2010). A comparative analysis of the application of h-index, g-index, and a-index in institutional-level research. Journal of Library and Information Studies, 8(2), 1–10.
  21. Huang, M.-H., & Lin, C. S. (2011). Counting methods and university ranking by h-index. ASIST 2011, October 9–13, 2011, New Orleans, LA, USA.
  22. Jeremic, V., Bulajic, M., Martic, M., & Radojicic, Z. (2011). A fresh approach to evaluating the academic ranking of world universities. Scientometrics, 87(3), 587–596.
  23. Lazaridis, T. (2010). Ranking university departments using the mean h-index. Scientometrics, 82(2), 211–216.
  24. Leydesdorff, L., & Opthof, T. (2010). Normalization, CWTS indicators, and the Leiden rankings: Differences in citation behavior at the level of fields. arXiv preprint arXiv:1003.3977.
  25. Liu, N. C., & Cheng, Y. (2005). The Academic Ranking of World Universities. Higher Education in Europe, 30(2), 127–136.
  26. Liu, N. C., Cheng, Y., & Liu, L. (2005). Academic ranking of world universities using scientometrics: A comment to the Fatal Attraction. Scientometrics, 64(1), 101–109.
  27. Liu, N. C., & Liu, L. (2005). University rankings in China. Higher Education in Europe, 30(2), 217–227.
  28. Matthews, A. P. (2012). South African universities in world rankings. Scientometrics, 92(3), 675–695.
  29. Molinari, A., & Molinari, J. (2008). Mathematical aspects of a new criterion for ranking scientific institutions based on the h-index. Scientometrics, 75(2), 339–356.
  30. National Institutional Ranking Framework, NIRF. (2015). A methodology for ranking of universities and colleges in India. Ministry of Human Resource Development (MHRD). Accessed January 2016.
  31. Nishy, P., Panwar, Y., Prasad, S., Mandal, G., & Prathap, G. (2012). An impact-citations-exergy (iCX) trajectory analysis of leading research institutions in India. Scientometrics, 91(1), 245–251.
  32. Singh, V. K., Uddin, A., & Pinto, D. (2015). Computer science research: The top 100 institutions in India and in the world. Scientometrics, 104(2), 529–553.
  33. Torres-Salinas, D., Moreno-Torres, J. G., Delgado-Lopez-Cozar, E., & Herrera, F. (2011). A methodology for institution-field ranking based on a bi-dimensional analysis. Scientometrics, 88(3), 771–786.
  34. Uddin, A., Bhoosreddy, J., Tiwari, M., & Singh, V. K. (2016). A sciento-text framework to characterize research strength of institutions at fine-grained thematic area level. Scientometrics, 106(3), 1135–1150.
  35. Uddin, A., & Singh, V. K. (2015). A Quantity–Quality composite ranking of Indian institutions in CS research. IETE Technical Review, 32(4), 273–283.
  36. UGC Note on Vacant Positions. (2014). Accessed on June 18, 2014.
  37. Van Raan, A. F. J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
  38. Vinkler, P. (2006). Composite scientometric indicators for evaluating publications of research institutes. Scientometrics, 68(3), 629–642.
  39. Waltman, L., Calero-Medina, C., Kosten, J., Noyons, E. C. M., Tijssen, R. J. W., Van Eck, N. J., et al. (2012). The Leiden Ranking 2011/2012: Data collection, indicators, and interpretation. Journal of the American Society for Information Science and Technology, 63(12), 2419–2432.
  40. Waltman, L., & Van Eck, N. J. (2012). A new methodology for constructing a publication-level classification system of science. Journal of the American Society for Information Science and Technology, 63(12), 2378–2392.
  41. Waltman, L., Van Eck, N. J., Van Leeuwen, T. N., Visser, M. S., & Van Raan, A. F. J. (2011). Towards a new crown indicator: An empirical analysis. Scientometrics, 87(3), 467–481.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2016

Authors and Affiliations

  • Aparna Basu (1)
  • Sumit Kumar Banshal (2)
  • Khushboo Singhal (2)
  • Vivek Kumar Singh (3)
  1. CSIR-NISTADS, New Delhi, India
  2. Department of Computer Science, South Asian University, New Delhi, India
  3. Department of Computer Science, Banaras Hindu University, Varanasi, India
