
A flexible approach for measuring author-level publishing performance

Published in Scientometrics.
Abstract

We propose a framework to evaluate author-level publishing performance in relative terms. To that end we introduce the publishing performance index (PPI) and the publishing performance box (PPB), and discuss the associated publishing profiles. We illustrate our approach by conducting an extensive empirical application covering 472 top economists and by developing several robustness tests. Instead of using a pre-designed measure with no flexibility to adjust to the circumstances of each specific case, our approach accommodates alternative evaluation criteria, as defined by the evaluators. Beyond this key characteristic, our approach has other important advantages: (1) it is easy to apply; (2) it is sensitive to the full list of publications and citations; (3) it can include additional dimensions of scientific performance beyond papers and citations; (4) it is a high-granularity measure, providing a complete ranking of the authors under analysis.
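The PPI formula itself is not reproduced in this abstract. As a rough illustration of the general idea of an evaluator-weighted composite ranking over flexible criteria, the following sketch may help; all author data, criterion names, and weights below are hypothetical and do not come from the paper.

```python
# Hypothetical sketch: rank authors by a weighted sum of min-max
# normalised evaluation criteria chosen by the evaluators.
# Assumes each criterion takes at least two distinct values in the group
# (otherwise the min-max denominator would be zero).

def composite_scores(authors, weights):
    """Return (author, score) pairs sorted from best to worst."""
    criteria = list(weights)
    # Min-max normalise each criterion across the group of authors.
    lows = {c: min(a[c] for a in authors.values()) for c in criteria}
    highs = {c: max(a[c] for a in authors.values()) for c in criteria}
    scores = {}
    for name, record in authors.items():
        scores[name] = sum(
            weights[c] * (record[c] - lows[c]) / (highs[c] - lows[c])
            for c in criteria
        )
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative data: papers, citations, and a third dimension picked by
# the evaluators (here, number of distinct co-authors).
authors = {
    "A": {"papers": 120, "citations": 9000, "coauthors": 80},
    "B": {"papers": 60, "citations": 12000, "coauthors": 40},
    "C": {"papers": 200, "citations": 4000, "coauthors": 150},
}
weights = {"papers": 0.3, "citations": 0.5, "coauthors": 0.2}
ranking = composite_scores(authors, weights)
print(ranking)
```

Because the weights and the criterion list are plain inputs, evaluators can swap criteria in and out, or run additional rounds with new criteria on a subgroup, without changing the scoring machinery.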


Notes

  1. Google Scholar reports 8,943 citations on the same date.

  2. For a discussion of the main reasons justifying the increasing role of collaborative work in research, see Leahey (2016). Henriksen (2018) focuses on the specific case of economics.

  3. While the number of papers is excluded from several author-level performance measures, there are valid reasons to include this dimension, as discussed by Hausken (2016).

  4. The example discussed in this section includes mainly top researchers. When the analysis covers medium- to low-performing researchers, some differences may emerge. Since the differences among such authors in terms of scientific output are probably smaller, the PPI will produce values that are closer together. This makes the consideration of additional evaluation criteria even more important. This can be done in two ways: first, within our framework, through additional rounds with new criteria for the group of authors with higher levels of scientific performance; second, through the consideration of qualitative elements (peer review). It seems fair to say that the role of the evaluators is even more important when the group under analysis is more homogeneous.

  5. We test uniform counting because the position of the authors in the byline is irrelevant when alphabetical ordering of names is the rule followed by the vast majority of authors and papers. This is the case in economics, where the bylines of around 90% of multi-authored papers follow alphabetical order (Kadel and Walter 2015). We conduct a similar analysis for our sample (11,230 multi-authored papers) and find a similar value (91.38%).
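The byline check described in this note amounts to counting the share of multi-authored papers whose authors are listed in alphabetical order by surname. A minimal sketch, with made-up bylines (the paper's actual data and matching rules are not reproduced here):

```python
# Minimal sketch: share of multi-authored papers whose bylines are in
# alphabetical order by surname. Assumes the surname is the last token
# of each author string; the sample bylines are illustrative only.

def is_alphabetical(byline):
    """True if surnames appear in non-decreasing alphabetical order."""
    surnames = [name.split()[-1].lower() for name in byline]
    return surnames == sorted(surnames)

bylines = [
    ["Anne Adams", "Bert Brown", "Carl Clark"],  # alphabetical
    ["Carl Clark", "Anne Adams"],                # not alphabetical
    ["Bea Brown", "Bert Brown"],                 # ties count as ordered
]
share = sum(is_alphabetical(b) for b in bylines) / len(bylines)
print(f"{share:.2%} of multi-authored papers follow alphabetical order")
```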

  6. Our sample comprises more than 22 million indirect citations.

  7. This procedure can, of course, be extended to more than four dimensions.

  8. The criteria used for the selection process should, of course, be defined a priori.

References

  • Abramo, G., & D’Angelo, C. A. (2014). How do you define and measure research productivity? Scientometrics, 101, 1129–1144.

  • Abt, H. (2012). A publication index that is independent of age. Scientometrics, 91, 863–868.

  • Alonso, S., Cabrerizo, F., Herrera-Viedma, E., & Herrera, F. (2009). h-index: A review focused in its variants, computation and standardization for different scientific fields. Journal of Informetrics, 3, 273–289.

  • Alonso, S., Cabrerizo, F., Herrera-Viedma, E., & Herrera, F. (2010). hg-index: A new index to characterize the scientific output of researchers based on the h- and g-indices. Scientometrics, 82, 391–400.

  • Amjad, T., & Daud, A. (2017). Indexing of authors according to their domain of expertise. Malaysian Journal of Library & Information Science, 22, 69–82.

  • Ball, P. (2005). Index aims for fair ranking of scientists. Nature, 436, 900.

  • Bergstrom, C., West, J., & Wiseman, M. (2008). The Eigenfactor™ metrics. Journal of Neuroscience, 28, 11433–11434.

  • Bornmann, L., Butz, A., & Wohlrabe, K. (2018). What are the top five journals in economics? A new meta-ranking. Applied Economics, 50, 659–675.

  • Bornmann, L., & Haunschild, R. (2018). Plots for visualizing paper impact and journal impact of single researchers in a single graph. Scientometrics, 115, 385–394.

  • Bornmann, L., & Marx, W. (2011). The h index as a research performance indicator. European Science Editing, 37, 77–80.

  • Bornmann, L., Mutz, R., & Daniel, H.-D. (2008). Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine. Journal of the American Society for Information Science and Technology, 59, 830–837.

  • Bornmann, L., Thor, A., Marx, W., & Schier, H. (2016). The application of bibliometrics to research evaluation in the humanities and social sciences: An exploratory study using normalized Google Scholar data for the publications of a research institute. Journal of the Association for Information Science and Technology, 67, 2778–2789.

  • da Silva, J., & Dobránszki, J. (2018). Multiple versions of the h-index: Cautionary use for formal academic purposes. Scientometrics, 115, 1107–1113.

  • Ding, Y. (2011). Applying weighted PageRank to author citation networks. Journal of the American Society for Information Science and Technology, 62, 236–245.

  • Dunnick, N. (2017). The h index in perspective. Academic Radiology, 24, 117–118.

  • Egghe, L. (2006). Theory and practise of the g-index. Scientometrics, 69, 131–152.

  • Egghe, L. (2010). The Hirsch index and related impact measures. Annual Review of Information Science and Technology, 44, 65–114.

  • Fenner, T., Harris, M., Levene, M., & Bar-Ilan, J. (2018). A novel bibliometric index with a simple geometric interpretation. PLoS ONE, 13, e0200098.

  • Frandsen, T., & Nicolaisen, J. (2010). What is in a name? Credit assignment practices in different disciplines. Journal of Informetrics, 4, 608–617.

  • Fu, H.-Z., Wang, M.-H., & Ho, Y.-S. (2012). The most frequently cited adsorption research articles in the Science Citation Index (Expanded). Journal of Colloid and Interface Science, 379, 148–156.

  • Gao, C., Wang, Z., Li, X., Zhang, Z., & Zeng, W. (2016). PR-index: Using the h-index and PageRank for determining true impact. PLoS ONE, 11, e0161755.

  • Glänzel, W., Debackere, K., Thijs, B., & Schubert, A. (2006). A concise review on the role of author self-citations in information science, bibliometrics and science policy. Scientometrics, 67, 263–277.

  • Hamermesh, D. (2018). Citations in economics: Measurement, uses, and impacts. Journal of Economic Literature, 56, 115–156.

  • Hammarfelt, B., & Rushforth, A. (2017). Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation. Research Evaluation, 26, 169–180.

  • Hausken, K. (2016). The ranking of researchers by publications and citations. Journal of Economics Bibliography, 3, 530–558.

  • Henriksen, D. (2018). What factors are associated with increasing co-authorship in the social sciences? A case study of Danish economics and political science. Scientometrics, 114, 1395–1421.

  • Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431.

  • Hirsch, J. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102, 16569–16572.

  • Hirsch, J. (2019). hα: An index to quantify an individual’s scientific leadership. Scientometrics, 118, 673–686.

  • Iglesias, J., & Pecharromán, C. (2007). Scaling the h-index for different scientific ISI fields. Scientometrics, 73, 303–320.

  • Järvelin, K., & Persson, O. (2008). The DCI index: Discounted cumulated impact-based research evaluation. Journal of the American Society for Information Science and Technology, 59, 1433–1440.

  • Kadel, A., & Walter, A. (2015). Do scholars in economics and finance react to alphabetical discrimination? Finance Research Letters, 14, 64–68.

  • Kosmulski, M. (2018). Are you in top 1% (1‰)? Scientometrics, 114, 557–565.

  • Larivière, V., Ni, C., Gingras, Y., Cronin, B., & Sugimoto, C. (2013). Global gender disparities in science. Nature, 504, 211–213.

  • Leahey, E. (2016). From sole investigator to team scientist: Trends in the practice and study of research collaboration. Annual Review of Sociology, 42, 81–100.

  • Liu, X., & Fang, H. (2012). Modifying h-index by allocating credit of multi-authored papers whose author names rank based on contribution. Journal of Informetrics, 6, 557–565.

  • Martín-Martín, A., Orduna-Malea, E., Thelwall, M., & López-Cózar, E. D. (2018). Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories. Journal of Informetrics, 12, 1160–1177.

  • Mayer, S., & Rathmann, J. (2018). How does research productivity relate to gender? Analyzing gender differences for multiple publication dimensions. Scientometrics, 117, 1663–1693.

  • Mazurek, J. (2018). A modification to Hirsch index allowing comparisons across different scientific fields. Current Science, 114, 2238–2239.

  • Moed, H. (2007). The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review. Science and Public Policy, 34, 575–583.

  • Osório, A. (2018). On the impossibility of a perfect counting method to allocate the credits of multi-authored publications. Scientometrics, 116, 2161–2173.

  • Perry, M., & Reny, P. (2016). How to count citations if you must. American Economic Review, 106, 2722–2741.

  • Prins, A., Costas, R., van Leeuwen, T., & Wouters, P. (2016). Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data. Research Evaluation, 25, 264–270.

  • Schreiber, M. (2018). A skeptical view on the Hirsch index and its predictive power. Physica Scripta, 93, 102501.

  • Todeschini, R., & Baccini, A. (2016). Handbook of bibliometric indicators: Quantitative tools for studying and evaluating research. Weinheim: Wiley.

  • Van Raan, A. (2006). Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics, 67, 491–502.

  • Waltman, L., & Costas, R. (2014). F1000 recommendations as a potential new data source for research evaluation: A comparison with citations. Journal of the Association for Information Science and Technology, 65, 433–445.

  • Wildgaard, L., Schneider, J., & Larsen, B. (2014). A review of the characteristics of 108 author-level bibliometric indicators. Scientometrics, 101, 125–158.

  • Wouters, P., Thelwall, M., Kousha, K., Waltman, L., de Rijcke, S., Rushforth, A., & Franssen, T. (2015). The metric tide: Literature review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management). Bristol: HEFCE.

  • Wu, Q. (2010). The w-index: A measure to assess scientific impact by focusing on widely cited papers. Journal of the American Society for Information Science and Technology, 61, 609–614.

  • Wuchty, S., Jones, B., & Uzzi, B. (2007). The increasing dominance of teams in production of knowledge. Science, 316, 1036–1039.

  • Zhang, C.-T. (2009). The e-index, complementing the h-index for excess citations. PLoS ONE, 4, e5429.


Acknowledgements

This work was supported by the Fundação para a Ciência e a Tecnologia under Grant UID/GES/00315/2019. We are grateful to the two anonymous referees for their very useful comments. The usual disclaimer applies.

Author information


Corresponding author

Correspondence to Nuno Crespo.


About this article


Cite this article

Simoes, N., Crespo, N. A flexible approach for measuring author-level publishing performance. Scientometrics 122, 331–355 (2020). https://doi.org/10.1007/s11192-019-03278-7

