Abstract
There is a worldwide trend towards the application of bibliometric research evaluation, in support of the needs of policy makers and research administrators. However, the assumptions and limitations of bibliometric measurement suggest a probabilistic rather than the traditional deterministic approach to the assessment of research performance. The aim of this work is to propose a multivariate stochastic model for measuring the performance of individual scientists, and to compare the results of its application with those arising from a deterministic approach. The dataset for the analysis covers the 2006–2010 scientific production, indexed in Web of Science, of over 900 Italian academic scientists working in two distinct fields of the life sciences.
Notes
Although the overall coverage achieved by the two databases does differ significantly, evidence suggests that for large-scale comparisons in the hard sciences, the use of either source yields similar results (Archambault et al. 2009).
The complete list is accessible on http://attiministeriali.miur.it/UserFiles/115.htm, last accessed on September 15, 2014.
Mathematics and computer sciences; physics; chemistry; earth sciences; biology; medicine; agricultural and veterinary sciences; civil engineering; industrial and information engineering.
A thorough description of the formula, the underlying theory, assumptions and limits is found in Abramo and D’Angelo (2014).
A preceding article by the same authors demonstrated that the average of the distribution of citations received for all cited publications of the same year and subject category is the most effective scaling factor (Abramo et al. 2012a).
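This mean-based scaling can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes the baseline is the mean citation count of the *cited* publications (zeros excluded) of the same year and subject category, per Abramo et al. (2012a); the numbers are invented for the example.

```python
from statistics import mean

def scaling_factor(field_year_citations):
    """Mean citations of the cited publications (uncited papers excluded)
    in a given subject category and publication year."""
    cited = [c for c in field_year_citations if c > 0]
    return mean(cited)

def normalized_impact(article_citations, field_year_citations):
    """Citations of one article scaled by its field-year baseline."""
    return article_citations / scaling_factor(field_year_citations)

# Hypothetical field-year distribution: an article with 12 citations,
# against a baseline of cited papers averaging 6 citations, scores 2.0.
baseline = [0, 3, 6, 9, 0, 6]
print(normalized_impact(12, baseline))  # → 2.0
```

Normalizing by this baseline makes citation counts comparable across fields with different citation practices, which is the premise of the cross-field comparisons in the study.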
The weighting values were assigned following advice from prestigious Italian representatives of the scientific community in the life sciences. The values could be changed to suit different practices in other national contexts.
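A position-weighted credit scheme of this kind can be sketched as follows. The weights here (0.4 each for the first and last authors, the remainder split among middle authors) are hypothetical placeholders, not the values actually adopted in the study, which reflect the life-sciences convention that byline extremes carry the most credit (cf. Abramo et al. 2013b).

```python
def author_shares(n_authors, first_w=0.4, last_w=0.4):
    """Split unit credit for one publication among its n_authors,
    giving extra weight to the first and last byline positions.
    The weight values are hypothetical, for illustration only."""
    if n_authors == 1:
        return [1.0]
    if n_authors == 2:
        return [0.5, 0.5]
    middle = (1.0 - first_w - last_w) / (n_authors - 2)
    return [first_w] + [middle] * (n_authors - 2) + [last_w]

# A four-author paper: first and last authors each receive 0.4,
# the two middle authors split the remaining 0.2.
print(author_shares(4))
```

As the note observes, a real evaluation would calibrate these weights to the authorship conventions of the national and disciplinary context.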
When articles are published in multidisciplinary journals, we assign them to the subject category in which they rank highest.
http://cercauniversita.cineca.it/php5/docenti/cerca.php, last accessed on September 15, 2014.
The weights applied may be changed according to the specific objectives of the evaluation.
References
Abramo, G., Cicero, T., & D’Angelo, C. A. (2012a). Revisiting the scaling of citations for research assessment. Journal of Informetrics, 6(4), 470–479.
Abramo, G., Cicero, T., & D’Angelo, C. A. (2012b). Revisiting size effects in higher education research productivity. Higher Education, 63(6), 701–717.
Abramo, G., & D’Angelo, C. A. (2014). How do you define and measure research productivity? Scientometrics. doi:10.1007/s11192-014-1269-8.
Abramo, G., D’Angelo, C. A., & Di Costa, F. (2008). Assessment of sectoral aggregation distortion in research productivity measurements. Research Evaluation, 17(2), 111–121.
Abramo, G., D’Angelo, C. A., & Di Costa, F. (2013a). Investigating returns to scope of research fields in universities. Higher Education. doi:10.1007/s10734-013-9685-x.
Abramo, G., D’Angelo, C. A., & Rosati, F. (2013b). The importance of accounting for the number of co-authors and their order when assessing research performance at the individual level in the life sciences. Journal of Informetrics, 7(1), 198–208.
Aguzzi, J., Costa, C., Antonucci, F., Company, J. B., Menesatti, P., & Sardá, F. (2009). Influence of diel behaviour in the morphology of decapod natantia. Biological Journal of the Linnean Society, 96, 517–532.
Amat, C. B. (2008). Editorial and publication delay of papers submitted to 14 selected Food Research journals. Influence of online posting. Scientometrics, 74(3), 379–389.
Archambault, É., Campbell, D., Gingras, Y., & Larivière, V. (2009). Comparing bibliometric statistics obtained from the Web of Science and Scopus. Journal of the American Society for Information Science and Technology, 60(7), 1320–1326.
Billaut, J. C., Bouyssou, D., & Vincke, P. (2010). Should you believe in the Shanghai ranking? An MCDM view. Scientometrics, 84, 237–263.
Burrell, Q. L. (2007). Hirsch’s h-index: A stochastic model. Journal of Informetrics, 1, 16–25.
Butler, L. (2007). Assessing university research: A plea for a balanced approach. Science and Public Policy, 34(8), 565–574.
Casale, M., Armanino, C., Casolino, C., & Forina, M. (2007). Combining information from headspace mass spectrometry and visible spectroscopy in the classification of the Ligurian olive oils. Analytica Chimica Acta, 589(1), 89–95.
Cerchiello, P., & Giudici, P. (2014). On a statistical h index. Scientometrics, 99, 299–312.
Costa, C., Menesatti, P., & Spinelli, R. (2012). Performance modelling in forest operations through partial least square regression. Silva Fennica, 46(2), 241–252.
D’Angelo, C. A., Giuffrida, C., & Abramo, G. (2011). A heuristic approach to author name disambiguation in bibliometrics databases for large-scale research assessments. Journal of the American Society for Information Science and Technology, 62(2), 257–269.
Forina, M., Oliveri, P., Casale, M., & Lanteri, S. (2008a). Multivariate range modeling, a new technique for multivariate class modeling: The uncertainty of the estimates of sensitivity and specificity. Analytica Chimica Acta, 622(1), 85–93.
Forina, M., Oliveri, P., Lanteri, S., & Casale, M. (2008b). Class-modeling techniques, classic and new, for old and new problems. Chemometrics and Intelligent Laboratory Systems, 93(2), 132–148.
Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375.
Glänzel, W. (2008). Seven myths in bibliometrics: About facts and fiction in quantitative science studies. In H. Kretschmer, & F. Havemann (Eds.), Proceedings of WIS 4th international conference on webometrics, informetrics and scientometrics, and 9th COLLNET meeting, Berlin, Germany.
Hall, G. J., & Kenny, J. E. (2007). Estuarine water classification using EEM spectroscopy and PARAFAC-SIMCA. Analytica Chimica Acta, 581(1), 118–124.
Kennard, R. W., & Stone, L. A. (1969). Computer aided design of experiments. Technometrics, 11(1), 137–148.
Krafft, C., Shapoval, L., Sobottka, S. B., Geiger K. D., Schackert G., & Salzer R. (2006). Identification of primary tumors of brain metastases by SIMCA classification of IR spectroscopic images. Biochimica et Biophysica Acta (BBA)-Biomembranes, 1758(7), 883–891.
Laudel, G., & Origgi, G. (2006). Introduction to a special issue on the assessment of interdisciplinary research. Research Evaluation, 15(1), 2–4.
Leydesdorff, L., & Bornmann, L. (2011). How fractional counting of citations affects the impact factor: Normalization in terms of differences in citation potentials among fields of science. Journal of the American Society for Information Science and Technology, 62(2), 217–229.
Luwel, M., & Moed, H. F. (1998). Publication delays in the science field and their relationship to the ageing of scientific literature. Scientometrics, 41(1–2), 29–40.
Menesatti, P., Antonucci, F., Pallottino, F., Bucarelli, F. M., & Costa, C. (2014). Spectrophotometric qualification of Italian pasta produced by traditional or industrial production parameters. Food and Bioprocess Technology, 7(5), 1364–1370.
Moed, H. F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.
Pepe, A., & Kurtz, M. J. (2012). A Measure of total research impact independent of time and discipline. PLoS One, 7(11), e46428.
Pratelli, L., Baccini, A., Barabesi, L., & Marcheselli, M. (2012). Statistical analysis of the Hirsch index. Scandinavian Journal of Statistics, 39, 681–694.
Radicchi, F., & Castellano, C. (2013). Analysis of bibliometric indicators for individual scholars in a large data set. Scientometrics, 97(3), 627–637.
Radicchi, F., Fortunato, S., & Castellano, C. (2008). Universality of citation distributions: Toward an objective measure of scientific impact. Proceedings of the National Academy of Sciences of the United States of America, 105(45), 17268–17272.
Taiti, C., Costa, C., Menesatti, P., Comparini, D., Bazihizina, N., Azzarello, E., Masi, E., & Mancuso, S. (2014). Class-modeling approach to PTR-TOFMS data: a peppers case study. Journal of the Science of Food and Agriculture. doi:10.1002/jsfa.6761.
Todeschini, R. (2011). The j-index: A new bibliometric index and multivariate comparisons between other common indices. Scientometrics, 87, 621–639.
Trivedi, P. K. (1993). An analysis of publication lags in econometrics. Journal of Applied Econometrics, 8(1), 93–100.
van Raan, A. F. J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
van Raan, A. F. J. (2006). Statistical properties of bibliometric indicators: Research group indicator distributions and correlations. Journal of the American Society for Information Science and Technology, 57(3), 408–430.
van Raan, A. F. J. (2008). Bibliometric statistical properties of the 100 largest European research universities: Prevalent Scaling rules in the science system. Journal of the American Society for Information Science and Technology, 59(3), 461–475.
Vanden Branden, K., & Hubert, M. (2005). Robust classification in high dimensions based on the SIMCA method. Chemometrics and Intelligent Laboratory Systems, 79(1–2), 10–21.
Wold, S., & Sjöström, M. (1977). SIMCA: A method for analyzing chemical data in terms of similarity and analogy. In B. R. Kowalski (Ed.), Chemometrics: Theory and application (pp. 243–282). Washington, DC: American Chemical Society Symposium Series 52.
Zhang, Z., Cheng, Y., & Liu, N. C. (2014). Comparison of the effect of mean-based method and z-score for field normalization of citations at the level of Web of Science subject categories. Scientometrics. doi:10.1007/s11192-014-1294-7.
Abramo, G., Costa, C. & D’Angelo, C.A. A multivariate stochastic model to assess research performance. Scientometrics 102, 1755–1772 (2015). https://doi.org/10.1007/s11192-014-1474-5