
How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications


Abstract

In this paper we analyse the presence and possibilities of altmetrics for bibliometric and performance analysis. Using the web-based tool Impact Story, we collected metrics for 20,000 random publications from the Web of Science. We studied the presence and distribution of altmetrics in this set of publications across fields, document types and publication years, as well as the extent to which altmetrics correlate with citation indicators. The main result of the study is that the altmetrics source providing the most metrics is Mendeley, with readership metrics for 62.6 % of all publications studied; other sources provide only marginal information. In terms of the relation with citations, a moderate Spearman correlation (r = 0.49) was found between Mendeley readership counts and citation indicators. Further possibilities and limitations of these indicators are discussed and future research lines are outlined.


Notes

  1. Altmetrics are immediately available, in contrast to citations, which take time to accumulate.

  2. Impact Story was previously known as Total Impact; in this study we use IS to refer to Impact Story. For a review of tools for tracking scientific impact, see Wouters and Costas (2012).

  3. For a full list see http://impactstory.org/faq.

  4. A RESTful (REpresentational State Transfer) API (Application Programming Interface) was used to send GET requests (containing the DOIs) and to collect the required responses from Impact Story; a minimal sketch of this request-and-parse workflow is given after these notes.

  5. In the previous study, the data collection was performed manually through the web interface of IS. In manual mode, IS allowed collecting altmetrics for 100 DOIs per search and a maximum of 2,000 DOIs per day, in order to avoid exceeding the limits of its API; for details see Zahedi, Costas and Wouters (2013).

  6. The additional functionality came from “proc groovy”, a Java development environment added to the SAS (Statistical Analysis System) environment, which was used for parsing and reading the JSON format and returning the data as an object (cf. the parsing step in the sketch after these notes).

  7. One DOI was missing from IS. We also found that 301 DOIs were wrong in WoS (they included extra characters that made them unmatchable and were therefore excluded from the analysis). In addition, 61 original DOIs from WoS pointed to 134 different WoS publications (i.e. duplicated DOIs), which means that 74 publications were duplicates. Given that there was no systematic way to determine which publication was the correct one (i.e. the one that actually received the altmetrics), we included all of them in the analysis with the same altmetrics scores, resulting in 20,000 − 1 − 301 + 74 = 19,772 final publications (this reconciliation is restated in a short snippet after these notes). All in all, this process showed that only 1.8 % of the initially selected DOIs had some problem, indicating that the DOI is a convenient publication identifier, although not free of limitations (e.g. errors in DOI data entry, technical errors when resolving DOIs via the API, and the existence of multiple publication identifiers in the data sources, which resulted in some errors in the collection of altmetrics for these publications).

  8. This means that publications without any metrics were left out of the analysis.

  9. This was the only PLOS paper captured by our sample.

  10. The non-citable document type corresponds to all WoS document types other than article, letter and review (e.g. book reviews, editorial materials, etc.).

  11. In Delicious, articles, non-citables, letters and review papers have the highest numbers of metrics, in that order.

  12. The average number of metrics per publication is calculated by dividing the total number of metrics from each data source by the total number of publications in the sample. For example, in Mendeley, the average number of readers per publication equals 99,050/19,772 ≈ 5 (see the snippet after these notes).

  13. In the previous study, we used the NOWT (Medium) classification with 14 subject fields. For more details see: http://nowt.merit.unu.edu/docs/NOWT-WTI_2010.pdf.

  14. Here publications can belong to multiple subject categories.

  15. According to the Global Research Report by Mendeley (http://www.mendeley.com/global-research-report/#.UjwfTsanqgk), the coverage of Mendeley across subjects is as follows: the highest coverage is of publications from Biological Science & Medicine (31 %), followed by Physical Sciences and Maths (16 %), Engineering & Materials Science (13 %), Computer & Information Science (10 %), Psychology, Linguistics & Education (10 %), Business Administration, Economics & Operation Research (8 %), Law & Other Social Sciences (7 %) and Philosophy, Arts & Literature & Other Humanities (5 %).

  16. For 9 fields (8 fields from the Arts and Humanities and 1 field from the Sciences), the CPP and RPP scores were exactly the same.

  17. In 2005, the two most tweeted papers were from the field of Physics; they received more than half of the total tweets in that year (472 tweets), showing a strongly skewed distribution.

  18. Calculating a Spearman correlation in SPSS for large datasets gives the error "Too many cases for the available storage"; to overcome this limitation, we followed the process mentioned in the text (an equivalent rank-based computation is sketched after these notes). For more details see: http://www.ibm.com/support/docview.wss?uid=swg21476714.

  19. Impact Story was in an initial stage of development (i.e. a ‘Beta’ version) at the time this study was carried out.

  20. For current limitations of IS see: http://impactstory.org/faq#toc_3_11.

  21. The time interval between the first and the second data collection was 6 months; the first collection was done manually, whereas the second was done automatically using REST API calls.

  22. Reasons for these differences can be changes or improvements in the identification of publications by Mendeley (e.g. merging versions of the same paper, identifying more DOIs, increases in the number of Mendeley users, etc.).
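
The following Python sketch illustrates the automated collection workflow described in notes 4–6: batches of DOIs are sent as RESTful GET requests and the JSON responses are parsed into plain records. It is a minimal sketch only; the endpoint URL, query parameter and response field names are assumptions for illustration, not the documented Impact Story API (the paper's actual pipeline used the IS API together with SAS "proc groovy").

```python
import json
import time
import urllib.parse
import urllib.request

# Hypothetical endpoint and schema: the exact IS route and JSON layout
# are not given in the text, so everything below is illustrative.
API_BASE = "http://impactstory.org/api/v1/items"

def fetch_batch(dois):
    """Send one RESTful GET request for a batch of DOIs, return parsed JSON."""
    query = urllib.parse.quote(",".join(dois))
    with urllib.request.urlopen(f"{API_BASE}?dois={query}") as resp:
        return json.loads(resp.read().decode("utf-8"))

def collect(all_dois, batch_size=100, pause=1.0):
    """Collect altmetrics in batches of 100 DOIs (the per-search limit
    mentioned in note 5), pausing between calls to respect API limits."""
    records = []
    for i in range(0, len(all_dois), batch_size):
        payload = fetch_batch(all_dois[i:i + batch_size])
        for item in payload:  # assumed: a list of per-DOI metric objects
            records.append({
                "doi": item.get("doi"),
                "mendeley_readers": item.get("mendeley", {}).get("readers", 0),
            })
        time.sleep(pause)  # crude rate limiting between batches
    return records
```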
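
A short snippet restating the record reconciliation from note 7, using only the counts reported there:

```python
initial = 20_000        # randomly selected WoS publications
missing_in_is = 1       # DOI missing from Impact Story
wrong_in_wos = 301      # malformed WoS DOIs, excluded
duplicated_dois = 61    # DOIs that pointed to 134 WoS publications
net_duplicates = 74     # extra duplicate publications kept in the set

final = initial - missing_in_is - wrong_in_wos + net_duplicates
assert final == 19_772  # final publication set

# Share of the initial sample that showed some DOI problem (~1.8 %).
problematic = (missing_in_is + wrong_in_wos + duplicated_dois) / initial
print(f"problematic DOIs: {problematic:.1%}")
```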
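
Likewise, the per-source average in note 12 is a simple ratio:

```python
mendeley_readers = 99_050   # total readership counts in the sample
publications = 19_772       # final publication set
print(round(mendeley_readers / publications, 2))  # ~5.01 readers/publication
```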
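
Note 18 works around SPSS's storage limit for Spearman correlations on large datasets. The text does not spell out the exact procedure used, but a standard equivalent, sketched below with synthetic data, is to rank both variables and compute a Pearson correlation on the ranks, which is by definition the Spearman correlation:

```python
import numpy as np
from scipy.stats import pearsonr, rankdata, spearmanr

rng = np.random.default_rng(0)
# Synthetic stand-ins for readership and citation counts.
readers = rng.poisson(5, size=19_772)
citations = rng.poisson(3, size=19_772) + (readers // 2)

# Spearman correlation = Pearson correlation of the (average) ranks.
r_ranks, _ = pearsonr(rankdata(readers), rankdata(citations))
r_direct, _ = spearmanr(readers, citations)
assert abs(r_ranks - r_direct) < 1e-8
print(f"Spearman r = {r_direct:.2f}")
```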

References

  • Archambault, É., & Larivière, V. (2006). The limits of bibliometrics for the analysis of the social sciences and humanities literature, International Social Science Council: World social sciences report 2010: Knowledge divides (pp. 251–254). Paris: UNESCO.


  • Armbruster, C. (2007). Access, usage and citation metrics: what function for digital libraries and repositories in research evaluation? Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1088453.

  • Bar-Ilan, J. (2012). JASIST@ Mendeley. Presented at ACM Web Science Conference Workshop on Altmetrics, Evanston, IL. Retrieved June 21, 2012 from http://altmetrics.org/altmetrics12/bar-ilan/.

  • Bar-Ilan, J., Haustein, S., Peters, I., Priem, J., Shema, H., & Terliesner, J. (2012). Beyond citations: Scholars’ visibility on the social Web. In Proceedings of the 17th International Conference on Science and Technology Indicators, Montreal, Quebec. Retrieved from http://arxiv.org/abs/1205.5611/.

  • Benos, D. J., Bashari, E., Chaves, J. M., et al. (2007). The ups and downs of peer review. Advances in Physiology Education, 31, 145–152.


  • Blecic, D. (1999). Measurements of journal use: An analysis of the correlations between three methods. Bulletin of the Medical Library Association, 87, 20–25.


  • Bollen, J., Van de Sompel, H., Hagberg, A., & Chute, R. (2009). A principal component analysis of 39 scientific impact measures. PLoS ONE, 4(6), e6022. doi:10.1371/journal.pone.0006022.


  • Bollen, J., Van de Sompel, H., & Rodriguez, M. A. (2008). Towards usage-based impact metrics. In Proceedings of the 8th ACM/IEEE-CS Joint Conference on Digital libraries (JCDL), New York, USA. Retrieved from http://arxiv.org/pdf/0804.3791v1.pdf.

  • Bordons, M., Fernandez, M. T., & Gomez, I. (2002). Advantages and limitations in the use of impact factor measures for the assessment of research performance. Scientometrics, 53, 195–206.


  • Bornmann, L. (2013). Is there currently a scientific revolution in Scientometrics? Journal of the American Society for Information Science & Technology. Retrieved from www.lutz-bornmann.de/icons/impactrevolution.pdf.

  • Bornmann, L., & Leydesdorff, L. (2013). The validation of (advanced) bibliometric indicators through peer assessments: a comparative study using data from InCites and F1000. Journal of Informetrics, 7(2), 286–291.


  • Brody, T., Harnad, S., & Carr, L. (2006). Earlier Web usage statistics as predictors of later citation impact. Journal of the American Society for Information Science, 57, 1060–1072.


  • Butler, L., & McAllister, I. (2011). Evaluating university research performance using metrics. European Political Science, 10(1), 44–58.


  • Davis, P. M. (2012). Tweets, and our obsession with alt metrics. The Scholarly Kitchen. Retrieved from http://scholarlykitchen.sspnet.org/2012/01/04/tweets-and-our-obsession-with-alt-metrics/.

  • Duy, J., & Vaughan, L. (2006). Can electronic journal usage data replace citation data as a measure of journal use? An empirical examination. The Journal of Academic Librarianship, 32(5), 512–517.


  • Eysenbach, G. (2011). Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact. Journal of Medical Internet Research, 13(4), e123.


  • Forta, B. (2008). Sams teach yourself SQL in 10 minutes. USA: Sams Publishing.


  • Galligan, F., & Dyas-Correia, S. (2013). Altmetrics: Rethinking the way we measure. Serials Review, 39(1), 56–61.


  • Haustein, S. (2010). Multidimensional journal evaluation. In Proceedings of the 11th International Conference on Science and Technology Indicators (pp. 120–122), Leiden, the Netherlands.

  • Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., & Terliesner, J. (2013). Coverage and adoption of altmetrics sources in the bibliometric community. In 14th International Society of Scientometrics and Informetrics Conference (pp. 1–12). Digital Libraries. Retrieved from http://arxiv.org/abs/1304.7300.

  • Haustein, S., & Siebenlist, T. (2011). Applying social bookmarking data to evaluate journal usage. Journal of Informetrics, 5, 446–457.


  • Henning, V. (2010). The top 10 journal articles published in 2009 by readership on Mendeley. Retrieved from http://www.mendeley.com/blog/academic-features/the-top-10-journal-articles-published-in-2009-by-readership-on-mendeley/.

  • Hicks, D., & Melkers, J. (2012). Bibliometrics as a tool for research evaluation. In A. Link & N. Vonortas (Eds.), Handbook on the theory and practice of program evaluation. Edward Elgar. Retrieved from http://works.bepress.com/diana_hicks/31.

  • Li, X., & Thelwall, M. (2012). F1000, Mendeley and traditional bibliometric indicators. In Proceedings of the 17th International Conference on Science and Technology Indicators (pp. 451–551), Montréal, Canada.


  • Li, X., Thelwall, M., & Giustini, D. (2012). Validating online reference managers for scholarly impact measurement. Scientometrics, 91(2), 461–471.


  • Lopez-Cozar, E. D., Robinson-Garcia, N., & Torres-Salinas, D. (2012). Manipulating Google Scholar Citations and Google Scholar Metrics: Simple, easy and tempting. Retrieved from http://arxiv.org/abs/1212.0638.

  • MacRoberts, M. H., & MacRoberts, B. R. (1989). Problems of citation analysis: A critical review. Journal of the American Society for Information Science, 40, 342–349.


  • Martin, B. R., & Irvine, J. (1983). Assessing basic research: Some partial indicators of scientific progress in radio astronomy. Research Policy, 12, 61–90.


  • Moed, H. F. (2005). Citation analysis in research evaluation. Berlin: Springer.


  • Moed, H. F. (2007). The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review. Science and Public Policy, 34(8), 575–583.


  • Moed, H. F. (2009). New developments in the use of citation analysis in research evaluation. Archivum Immunologiae et Therapiae Experimentalis, 57(1), 13–18.


  • Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the social sciences and the humanities: A review. Scientometrics, 66(1), 81–100.


  • Nederhof, A. J., & Van Raan, A. F. J. (1987). Peer review and bibliometric indicators of scientific performance: a comparison of cum laude doctorates with ordinary doctorates in physics. Scientometrics, 11(5–6), 333–350.


  • Nicolaisen, J. (2007). Citation analysis. Annual Review of Information Science and Technology, 41, 609–641.


  • Priem, J., & Hemminger, B. H. (2010). Scientometrics 2.0: Toward new metrics of scholarly impact on the social Web. First Monday, 15. Retrieved from http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2874/2570.

  • Priem, J., Piwowar, H., & Hemminger, B. H. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. arXiv:1203.4745v1.

  • Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto/.

  • Rousseau, R., & Ye, F. (2013). A multi-metric approach for research evaluation. Chinese Science Bulletin, 10–12.

  • Rowlands, I., & Nicholas, D. (2007). The missing link: Journal usage metrics. Aslib Proceedings, 59(3), 222–228.


  • Schlögl, C., Gorraiz, J., Gumpenberger, C., Jack, K., & Kraker, P. (2013). Download vs. citation vs. readership data: The case of an information systems journal. In J. Gorraiz, E. Schiebel, C. Gumpenberger, M. Hörlesberger, & H. Moed (Eds.), Proceedings of the 14th International Society of Scientometrics and Informetrics Conference, Vienna, Austria (pp. 626–634). Wien: Facultas Verlags und Buchhandels AG.

  • Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal, 314, 497.

  • Shema, H., Bar-Ilan, J., & Thelwall, M. (2013). Do blog citations correlate with a higher number of future citations? Research blogs as a potential source for alternative metrics. In J. Gorraiz, E. Schiebel, C. Gumpenberger, M. Hörlesberger, & H. Moed (Eds.), Proceedings of the 14th International Society of Scientometrics and Informetrics Conference, Vienna, Austria (pp. 604–611). Wien: Facultas Verlags und Buchhandels AG.


  • Shuai, X., Pepe, A., & Bollen, J. (2012). How the scientific community reacts to newly submitted preprints: Article downloads, Twitter mentions, and citations. Retrieved from http://arxiv.org/abs/1202.2461v1.

  • Smith, A. G. (1999). A tale of two web spaces: Comparing sites using web impact factors. Journal of Documentation, 55(5), 577–592.


  • Taylor, J. (2011). The assessment of research quality in UK universities: Peer review or metrics? British Journal of Management, 22(2), 202–217.


  • Thelwall, M. (2001). Extracting macroscopic information from web links. Journal of the American Society for Information Science and Technology, 52(13), 1157–1168.


  • Thelwall, M. (2004). Weak benchmarking indicators for formative and semi-evaluative assessment of research. Research Evaluation, 13(1), 63–68.


  • Thelwall, M. (2008). Bibliometrics to Webometrics. Journal of Information Science, 34(4), 605–621.


  • Thelwall, M. (2012a). A history of webometrics. Bulletin of the American Society for Information Science and Technology, 38(6), 18–23.


  • Thelwall, M. (2012b). Journal impact evaluation: a webometric perspective. Scientometrics, 92(2), 429–441.


  • Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PLoS ONE, 8(5), e64841.


  • Torres-Salinas, D., Cabezas-Clavijo, A., & Jimenez-Contreras, E. (2013a). Altmetrics: New indicators for scientific communication in Web 2.0. Comunicar. Retrieved from http://arxiv.org/ftp/arxiv/papers/1306/1306.6595.pdf.

  • Torres-Salinas, D., Robinson-Garcia, N., Campanario, J. M., & López-Cózar, E. D. (2013b). Coverage, field specialisation and the impact of scientific publishers indexed in the book citation index. Online Information Review, 38(1), 24–42.


  • Van Raan, A. F. J., Van Leeuwen, T. N., & Visser, M. S. (2011). Severe language effect in university rankings: particularly Germany and France are wronged in citation-based rankings. Scientometrics, 88(2), 495–498.


  • Vaughan, L., & Shaw, D. (2003). Bibliographic and web citations: what is the difference? Journal of the American Society for Information Science and Technology, 54(14), 1313–1322.


  • Waltman, L., & Costas, R. (2013). F1000 Recommendations as a potential new data source for research evaluation: a comparison with citations. Journal of the Association for Information Science and Technology. doi: 10.1002/asi.23040.

  • Waltman, L., Van Eck, N. J., Van Leeuwen, T. N., Visser, M. S., & Van Raan, A. F. J. (2011). Towards a new crown indicator: Some theoretical considerations. Journal of Informetrics, 5(1), 37–47.


  • Wouters, P. (1999). The Citation Culture, Ph.D. Thesis, University of Amsterdam.

  • Wouters, P., & Costas, R. (2012). Users, narcissism and control: Tracking the impact of scholarly publications in the 21st century. Utrecht: SURF foundation. Retrieved from http://www.surffoundation.nl/nl/publicaties/Documents/Users%20narcissism%20and%20control.pdf.

  • Zahedi, Z., Costas, R., & Wouters, P. (2013). How well developed are Altmetrics? Cross disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications (RIP). In J. Gorraiz, E. Schiebel, C. Gumpenberger, M. Hörlesberger, & H. Moed (Eds.), Proceedings of the 14th International Society of Scientometrics and Informetrics Conference, Vienna, Austria (pp. 876–884). Wien: Facultas Verlags und Buchhandels AG.


  • Zhang, Y. (2012). Comparison of select reference management tools. Medical Reference Services Quarterly, 31(1), 45–60.



Acknowledgments

This study is an extended version of our research-in-progress (RIP) paper presented at the 14th International Society of Scientometrics and Informetrics (ISSI) Conference, 15–19 July 2013, Vienna, Austria. We thank the IS team for their support in working with the Impact Story API. This work is partially supported by the EU FP7 ACUMEN project (Grant agreement: 266632). The authors would like to thank Erik Van Wijk from CWTS for his great help in managing the altmetrics data. The authors also acknowledge the useful suggestions of Ludo Waltman from CWTS and the fruitful comments of the anonymous referees of the journal.

Author information

Correspondence to Zohreh Zahedi.


Cite this article

Zahedi, Z., Costas, R. & Wouters, P. How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics 101, 1491–1513 (2014). https://doi.org/10.1007/s11192-014-1264-0
