A flexible approach for measuring author-level publishing performance

  • Nádia Simões
  • Nuno Crespo

Abstract

We propose a framework to evaluate author-level publishing performance in relative terms. To that end, we introduce the publishing performance index (PPI) and the publishing performance box (PPB), and discuss the associated publishing profiles. We illustrate the approach with an extensive empirical application covering 472 top economists, together with several robustness tests. Instead of relying on a pre-designed measure with no flexibility to adjust to the circumstances of each specific case, our approach accommodates alternative evaluation criteria, as defined by the evaluators. Beyond this key characteristic, the approach has other important advantages: (1) it is easy to apply; (2) it is sensitive to the full list of publications and citations; (3) it can incorporate additional dimensions of scientific performance beyond papers and citations; and (4) it is a high-granularity measure, providing a complete ranking of the authors under analysis.
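The core idea of evaluator-defined criteria can be illustrated with a minimal sketch. Note that this is not the PPI formula from the paper: the weighting scheme, the max-normalisation, and the toy author data below are all hypothetical stand-ins, chosen only to show how different evaluator weights over the same publication and citation records yield different complete rankings.

```python
# Hypothetical sketch only: the actual PPI is defined in the paper.
# Here an "evaluator" is just a dict of weights over performance dimensions.

def rank_authors(authors, weights):
    """Rank authors by a weighted sum of max-normalised dimensions.

    authors: {name: {dimension: value}}; weights: {dimension: weight}.
    Returns the author names as a complete ranking, best first.
    """
    # Normalise each dimension by its maximum so weights are comparable.
    maxima = {dim: max(a[dim] for a in authors.values()) for dim in weights}
    scores = {
        name: sum(w * a[dim] / maxima[dim] for dim, w in weights.items())
        for name, a in authors.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Toy records for three fictitious authors (illustrative values only).
authors = {
    "A": {"papers": 40, "citations": 900},
    "B": {"papers": 25, "citations": 1500},
    "C": {"papers": 60, "citations": 400},
}

# A citation-focused evaluator and an output-focused evaluator
# produce different complete rankings of the same authors.
print(rank_authors(authors, {"papers": 0.2, "citations": 0.8}))
print(rank_authors(authors, {"papers": 0.8, "citations": 0.2}))
```

With the citation-heavy weights the toy ranking is B, A, C; with the output-heavy weights it becomes C, A, B, which is the flexibility-to-the-evaluator property the abstract emphasises.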

Keywords

Research evaluation · Scientific outputs · Bibliometrics · Selection criteria · Evaluators

JEL Classification

A11 · A14 · I23 · M51

Notes

Acknowledgements

This work was supported by the Fundação para a Ciência e a Tecnologia under Grant UID/GES/00315/2019. We are grateful to the two anonymous referees for their very useful comments. The usual disclaimer applies.


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  1. ISCTE Business School, Economics Department, BRU-IUL (Business Research Unit), Instituto Universitário de Lisboa (ISCTE-IUL), Lisbon, Portugal