Do altmetrics work for assessing research quality?

  • Andrea Giovanni Nuzzolese
  • Paolo Ciancarini
  • Aldo Gangemi
  • Silvio Peroni
  • Francesco Poggi
  • Valentina Presutti

Abstract

Alternative metrics (also known as altmetrics) are gaining increasing interest in the scientometrics community, as they can capture both the volume and the quality of attention that a research work receives online. Nevertheless, little is known about their effectiveness as a means of measuring research impact compared to traditional citation-based indicators. This work rigorously investigates whether any correlation exists among indicators, either traditional (i.e. citation count and h-index) or alternative (i.e. altmetrics), and which of them may be effective for evaluating scholars. The study is based on the analysis of real data from the National Scientific Qualification procedure, carried out in Italy by committees of peers on behalf of the Italian Ministry of Education, Universities and Research.
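As a concrete illustration of the kind of analysis the abstract describes, the sketch below computes Spearman's rank correlation between each traditional indicator and an altmetric score across a set of scholars. This is a minimal hypothetical example in Python, not the study's actual pipeline: the variable names and values are invented for illustration, and Spearman's rho is shown only because rank correlation is a standard choice for skewed bibliometric distributions.

    # Minimal sketch of a rank-correlation analysis between indicators.
    # All numbers below are invented; real analyses would use
    # per-scholar values drawn from the evaluation data.
    from scipy.stats import spearmanr

    # Hypothetical indicator values for five scholars.
    citation_counts  = [120, 45, 300, 80, 15]  # traditional indicator
    h_indexes        = [10, 5, 18, 8, 3]       # traditional indicator
    altmetric_scores = [35, 60, 50, 12, 8]     # alternative indicator

    # Spearman's rho compares rankings rather than raw values, so it
    # tolerates the heavy skew typical of citation-style indicators.
    for name, values in [("citation count", citation_counts),
                         ("h-index", h_indexes)]:
        rho, p_value = spearmanr(values, altmetric_scores)
        print(f"altmetrics vs {name}: rho={rho:.2f}, p={p_value:.3f}")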

Keywords

Altmetrics · Research quality · Bibliometric indicators · Correlation analysis

Notes

Acknowledgements

This research has been supported by the Italian National Agency for the Evaluation of the University and Research Systems (ANVUR) within the Measuring the Impact of Research - Alternative indicators (MIRA) project. Andrea Giovanni Nuzzolese is the main contributor of this paper and the principal investigator of the project that supported this research.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  1. STLab, ISTC-CNR, Rome, Italy
  2. DISI, University of Bologna, Bologna, Italy
  3. FICLIT, University of Bologna, Bologna, Italy