Proof over promise: towards a more inclusive ranking of Dutch academics in Economics & Business


The Dutch Economics top-40, based on publications in ISI listed journals, is—to the best of our knowledge—the oldest ranking of individual academics in Economics and is well accepted in the Dutch academic community. However, this ranking is based on publication volume, rather than on the actual impact of the publications in question. This paper therefore uses two relatively new metrics, the citations per author per year (CAY) metric and the individual annual h-index (hIa) to provide two alternative, citation-based, rankings of Dutch academics in Economics & Business. As a data source, we use Google Scholar instead of ISI to provide a more comprehensive measure of impact, including citations to and from publications in non-ISI listed journals, books, working and conference papers. The resulting rankings are shown to be substantially different from the original ranking based on publications. Just like other research metrics, the CAY or hIa-index should never be used as the sole criterion to evaluate academics. However, we do argue that the hIa-index and the related CAY metric provide an important additional perspective over and above a ranking based on publications in high impact journals alone. Citation-based rankings are also shown to inject a higher level of diversity in terms of age, gender, discipline and academic affiliation and thus appear to be more inclusive of a wider range of scholarship.
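The two metrics can be sketched in a few lines of code. The following is a minimal illustration based on the definitions in Harzing, Alakangas and Adams (2014): each paper's citation count is divided by its number of co-authors, CAY is the sum of these author-normalised counts divided by career length, and hIa is the h-index of the normalised counts (hI,norm) divided by career length. The sample data is invented for illustration only.

```python
def h_index(counts):
    """Classic h-index: the largest h such that h items have >= h citations."""
    counts = sorted(counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
    return h

def cay(papers, career_years):
    """Citations per author per year: author-normalised citation total
    divided by career length in years."""
    total = sum(cites / n_authors for cites, n_authors in papers)
    return total / career_years

def hia(papers, career_years):
    """Individual annual h-index: h-index of author-normalised citation
    counts (hI,norm), divided by career length in years."""
    normalised = [cites / n_authors for cites, n_authors in papers]
    return h_index(normalised) / career_years

# Invented example: four papers as (citations, number of authors),
# for an academic ten years past their first publication.
papers = [(120, 3), (40, 2), (10, 1), (4, 4)]
print(cay(papers, career_years=10))   # 7.1
print(hia(papers, career_years=10))   # 0.3
```

Both corrections pull in the same direction: prolific co-authorship and long careers no longer inflate the score, which is what allows the resulting rankings to differ so markedly from a volume-based list.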



  1. The term economist is interpreted more broadly in the Netherlands than in Anglophone countries, where there is generally a clear separation between Economics and Business, disciplines that may be located in different Faculties or Schools. In the Netherlands, Economics is generally sub-divided into General Economics (Economics), Business Economics (Business, i.e. Management, Marketing, Finance & Accounting) and Quantitative Economics (roughly equivalent to Econometrics and Management Science). Hence the Economics top-40 includes academics in both Economics/Econometrics and Business.

  2. ESB also publishes another citation-based ranking, the Polderparade, which is based purely on citations in Dutch magazines and as such is not relevant for our discussion.

  3. A recent publication (Abbring et al. 2014) shows that this choice alone dramatically influences the resulting ranking. The authors propose an alternative publication-based ranking using the raw Article Influence Score (AIS); only half of the academics in the original top-40 appear in this new ranking. This clearly shows how vulnerable rankings are to the choice of criteria, a point we return to in our discussion section.

  4. Every participating university in the Netherlands (11 in total) can nominate up to 20 (for large universities) or up to 10 (for small universities) economists for inclusion in the Dutch Economists Top 40. Criteria for nomination include at least a 0.2 FTE appointment and at least one publication in a recognised journal in Economics & Business, to ensure the nominee has a link to this field. We received the list of nominees from the team coordinating the ranking in order to enable an impact analysis beyond the Web of Science and to explore opportunities to make the ranking's methodology more inclusive.

  5. Google Scholar is not without its critics (see e.g. Jacso 2010). However, recent large-scale investigations of Google Scholar accuracy (e.g., the LSE project on impact in the Social Sciences, London School of Economics and Political Science 2011; Harzing 2013) suggest that the level of accuracy, stability and comprehensiveness displayed by Google Scholar is sufficient for bibliometric analyses. In the LSE project, the publications listed and the citing sources were verified manually for duplicate entries, unacknowledged citations, publishers’ publicity materials, etc.; these were removed to produce a completely ‘cleaned’ score. The correlation between the original and cleaned scores was 0.95.

  6. A similar argument could be made for the original Economics ranking, which is based on recent publications in high-impact journals. Senior academics might have a better chance of getting their papers accepted in these journals, especially if they have published in them before, even if the paper itself isn’t necessarily of higher quality.

  7. Please note that this limitation also applies to the ISI database: although one can limit the year range for articles, it is not possible to do so for citations.


  1. Abbring, J. H., Bronnenberg, B. J., Gautier, P. A., & van Ours, J. C. (2014). Dutch Economists top 40. De Economist, 162, 107–114.

  2. Adler, N., & Harzing, A. W. (2009). When knowledge wins: Transcending the sense and nonsense of academic rankings. The Academy of Management Learning & Education, 8(1), 72–95.

  3. Börner, K., Dall’Asta, L., Ke, W., & Vespignani, A. (2005). Studying the emerging global brain: Analyzing and visualizing the impact of co-authorship teams. Complexity, 10(4), 57–67.

  4. Bornmann, L., Mutz, R., Hug, S. E., & Daniel, H. D. (2011). A multilevel meta-analysis of studies reporting correlations between the h-index and 37 different h-index variants. Journal of Informetrics, 5(3), 346–359.

  5. Franses, P. H. (2014). Trends in three decades of rankings of Dutch Economists. Scientometrics, 98(2), 1257–1268.

  6. García, J. A., Rodriguez-Sánchez, R., & Fdez-Valdivia, J. (2012). A comparison of top economics departments in the US and EU on the basis of the multidimensional prestige of influential articles in 2010. Scientometrics, 93(3), 681–698.

  7. Glänzel, W., & Thijs, B. (2004). Does co-authorship inflate the share of self-citations? Scientometrics, 61(3), 395–404.

  8. Harzing, A. W. (2005). Australian research output in Economics & Business: High volume, low impact? Australian Journal of Management, 30(2), 183–200.

  9. Harzing, A. W. (2007). Publish or Perish. Retrieved February 3, 2014, from

  10. Harzing, A. W. (2013). A preliminary test of Google Scholar as a source for citation data: A longitudinal study of Nobel Prize winners. Scientometrics, 93(3), 1057–1075.

  11. Harzing, A. W., Alakangas, S., & Adams, D. (2014). hIa: An individual annual h-index to accommodate disciplinary and career length differences. Scientometrics, 99(3), 811–821.

  12. Harzing, A. W., & van der Wal, R. (2009). A Google Scholar h-index for journals: An alternative metric to measure journal impact in Economics & Business? Journal of the American Society for Information Science and Technology, 60(1), 41–46.

  13. Jacso, P. (2010). Metadata mega mess in Google Scholar. Online Information Review, 34(1), 175–191.

  14. Jin, J. C., & Choi, E. K. (2014). Citations of most often cited economists: Do scholarly books matter more than quality journals? Pacific Economic Review, 19(1), 8–24.

  15. Judge, T. A., Cable, D. M., Colbert, A. E., & Rynes, S. L. (2007). What causes a management article to be cited—Article, author, or journal? Academy of Management Journal, 50(3), 491–506.

  16. Kalaitzidakis, P., Mamuneas, T. P., & Stengos, T. (2003). Rankings of academic journals and institutions in economics. Journal of the European Economic Association, 1(6), 1346–1366.

  17. Katz, J. S., & Martin, B. R. (1997). What is research collaboration? Research Policy, 26(1), 1–18.

  18. Kodrzycki, Y. K., & Yu, P. (2006). New approaches to ranking economics journals. Contributions in Economic Analysis & Policy, 5(1), 1–40.

  19. London School of Economics and Political Science. (2011). Impact of the social sciences: Maximizing the impact of academic research. Retrieved February 3, 2014, from

  20. Nederhof, A. J. (2008). Policy impact of bibliometric rankings of research performance of departments and individuals in economics. Scientometrics, 74(1), 163–174.

  21. Prathap, G. (2010). The 100 most prolific economists using the p-index. Scientometrics, 84(1), 167–172.

  22. Scott, L. C., & Mitias, P. M. (1996). Trends in rankings of economics departments in the US: An update. Economic Inquiry, 34(2), 378–400.

  23. Seglen, P. O. (1992). The skewness of science. Journal of the American Society for Information Science, 43(9), 628–638.

  24. Singh, G., Haddad, K. M., & Chow, C. W. (2007). Are articles in “top” management journals necessarily of higher quality? Journal of Management Inquiry, 16(4), 319–331.

  25. Starbuck, W. H. (2005). How much better are the most-prestigious journals? The statistics of academic publication. Organization Science, 16(2), 180–200.

  26. Tol, R. S. (2009). The h-index and its alternatives: An application to the 100 most prolific economists. Scientometrics, 80(2), 317–324.


Author information



Corresponding author

Correspondence to Anne-Wil Harzing.



Cite this article

Harzing, A., Mijnhardt, W. Proof over promise: towards a more inclusive ranking of Dutch academics in Economics & Business. Scientometrics 102, 727–749 (2015).



  • Rankings
  • Citations
  • ISI
  • Google Scholar
  • Economics