Herald of the Russian Academy of Sciences, Volume 88, Issue 5, pp 394–400

An Altmetric As an Indicator of a Publication’s Scientific Impact

  • V. A. Markusova
  • L. E. Mindeli
  • V. G. Bogorov
  • A. N. Libkind
From the Researcher’s Notebook

Abstract

The results of an empirical pilot project focused on the association between classical bibliometrics—publication, citation index, and journal cited half-life—and an altmetric—the assessment of an article’s impact—are discussed. The analysis included an array of 37 200 domestic articles indexed in SCI-E in 2015. Two altmetrics are used: usage counts for the last 180 days, U1, and usage counts since February 1, 2013, U2. A significant Kendall rank correlation has been identified between citation indices and article-level metrics. A stronger correlation has been observed for the long-term usage counts, U2. The relationship between usage metrics and traditional journal-level metrics (cited half-life) has also been analyzed. A rather weak negative correlation between cited half-life and U1 (U2) has been revealed, which is described by an inverse logarithmic dependence. In the authors’ opinion, altmetrics should not be opposed to classical bibliometrics; they should be used as additional metrics to assess an article’s impact.
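The two statistical tools the abstract names—Kendall rank correlation between citation and usage counts, and an inverse logarithmic dependence of usage on cited half-life—can be sketched in plain Python. The data below are invented for illustration (not the authors’ sample of 37 200 articles), and the tau variant (tau-a) and function names are assumptions, since the paper does not specify them:

```python
import math
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs.

    Tied pairs contribute zero. The paper does not say which tau
    variant (tau-a or tau-b) was used, so this is the simplest form.
    """
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def fit_log_model(x, y):
    """Least-squares fit of y = a + b * ln(x): the kind of inverse
    logarithmic dependence the abstract describes between cited
    half-life (x) and usage counts (y); b < 0 for a negative relation.
    """
    lx = [math.log(v) for v in x]
    n = len(x)
    mean_lx = sum(lx) / n
    mean_y = sum(y) / n
    b = (sum((u - mean_lx) * (v - mean_y) for u, v in zip(lx, y))
         / sum((u - mean_lx) ** 2 for u in lx))
    a = mean_y - b * mean_lx
    return a, b

# Illustrative (invented) data: citation counts vs. long-term usage U2.
citations = [0, 1, 3, 8, 20, 55]
usage_u2 = [2, 5, 9, 14, 40, 90]
print(kendall_tau(citations, usage_u2))  # fully concordant ranks -> 1.0
```

In practice one would use `scipy.stats.kendalltau`, which handles ties via tau-b and also returns a p-value; the hand-rolled version above only shows what the statistic measures.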


Keywords: altmetrics; indicators; citation index; article usage indicator; publications; journal cited half-life; journal; Kendall rank correlation



This article was supported in part by the Russian Foundation for Basic Research, project nos. 17-02-00157 “Comparative Analysis of the Dynamics of Domestic and Global Natural Scientific Priorities over the Entire Post-Soviet Period, Including Assessment of the Russian Diaspora’s Participation in the Creation of Modern Emerging Technologies for Use in the Domestic Economy (Web of Science)” and 17-02-00078 “Documentary Scientific Communications, Their Role in the Transfer, Dissemination, and Preservation of Domestic Research Results in Social Sciences and the Humanities as Compared to the BRICS and EU Countries, the United States, and Canada (according to the Web of Science over 1986–2015) and the Assessment of the Impact of These Results on Decisions Made in the Field of Socioeconomic Policy” and in part by the RAS Presidium basic research program “Scientific Fundamentals of the Development of the Russian Scientific Innovative Complex in the Context of Global Transformations” (coordinator V.V. Ivanov).


References

  1. J. Priem, P. Groth, and D. Taraborelli, “The altmetrics collection,” PLoS ONE 7 (11), e48753 (2012).
  2. J. Priem, D. Taraborelli, P. Groth, and C. Neylon, Altmetrics: A Manifesto, Oct. 26, 2010. Cited March 20, 2018.
  3. C. L. Gonzalez-Valiente, J. Pacheco-Mendoza, and R. Arencibia-Jorge, “A review of altmetrics as an emerging discipline for research evaluation,” Learned Publishing 29 (4), 229–238 (2016).
  4. S. Haustein, I. Peters, J. Bar-Ilan, et al., “Coverage and adoption of altmetrics sources in the bibliometric community,” Scientometrics 101 (2), 1145–1163 (2014).
  5. X. Wang, Z. Fang, and X. Sun, “Usage patterns of scholarly articles on Web of Science,” Scientometrics 109 (2), 917–926 (2016).
  6. C. Gumpenberger, W. Glänzel, and J. Gorraiz, “The ecstasy and the agony of the altmetric score,” Scientometrics 108 (2), 977–982 (2016).
  7. P. Chi and W. Glänzel, “An empirical investigation of the associations among usage, scientific collaboration and citation impact,” Scientometrics 112 (7), 403–412 (2017).
  8. W. Glänzel, “Characteristic scores and scales: A bibliometric analysis of subject characteristics based on long-term citation observation,” J. Informetrics 1 (1), 92–102 (2007).
  9. P. Chi and W. Glänzel, “Impact and usage indicators for the assessment of research in scientific disciplines and journals,” Scientometrics (in press).
  10. G. Lewison and V. Markusova, “The evaluation of Russian cancer research,” Res. Evaluation 19 (2), 129–144 (2010).
  11. M. Karaulova, A. Gök, O. Shackleton, and P. Shapira, “Science system path dependencies and their influences: Nanotechnology research in Russia,” Scientometrics 100 (3), 365–383 (2016).
  12. V. A. Markusova, “Information resources to monitor Russian science,” Vestn. Ross. Akad. Nauk 75 (7), 607–612 (2005).

Copyright information

© Pleiades Publishing, Ltd. 2018

Authors and Affiliations

  1. All-Russia Institute for Scientific and Technical Information, Russian Academy of Sciences, Moscow, Russia
  2. Institute for the Study of Science, Russian Academy of Sciences, Moscow, Russia
  3. Clarivate Analytics, Philadelphia, United States
