Citation analysis with Microsoft Academic
We explore whether and how Microsoft Academic (MA) can be used for bibliometric analyses. First, we examine the Academic Knowledge API (AK API), an interface for accessing MA data, and compare it to Google Scholar (GS). Second, we perform a comparative citation analysis of researchers based on normalized data from MA and Scopus. We find that MA offers structured and rich metadata, which facilitates data retrieval, handling and processing. In addition, the AK API allows retrieving frequency distributions of citations. We consider these features a major advantage of MA over GS. However, we identify four main limitations in the available metadata. First, MA does not provide the document type of a publication. Second, the “fields of study” are dynamic and too specific, and the field hierarchies are incoherent. Third, some publications are assigned to incorrect publication years. Fourth, the metadata of some publications does not include all authors. Nevertheless, we show that both an average-based indicator (the journal normalized citation score; JNCS) and a distribution-based indicator (percentile rank classes; PR classes) can be calculated with relative ease using MA. Hence, normalization of citation counts is feasible with MA. The citation analyses in MA and Scopus yield consistent results: the JNCS and the PR classes are similar in both databases, and, as a consequence, the assessment of the researchers’ publication impact is congruent across MA and Scopus. Given MA’s rapid development over the last year, we postulate that it has the potential to be used for full-fledged bibliometric analyses.
Keywords: Normalization · Citation analysis · Percentiles · Microsoft Academic · Google Scholar · Scopus
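The two indicators named in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' code, and it assumes the standard bibliometric definitions: the JNCS is a paper's citation count divided by the mean citation count of the papers published in the same journal and year, and PR classes bin a paper by its citation percentile within a reference set. The class boundaries below (bottom 50%, 50–75%, 75–90%, 90–99%, top 1%) are one common illustrative choice, not necessarily the scheme used in the paper.

```python
from statistics import mean

def jncs(paper_citations, journal_year_citations):
    """Journal normalized citation score: the paper's citations relative to
    the mean citations of the journal's papers from the same year."""
    expected = mean(journal_year_citations)
    return paper_citations / expected if expected else 0.0

def percentile(paper_citations, reference_citations):
    """Share of papers in the reference set cited less often (0-100)."""
    below = sum(c < paper_citations for c in reference_citations)
    return 100 * below / len(reference_citations)

def pr_class(paper_citations, reference_citations):
    """Assign a percentile rank class; boundaries are an illustrative choice."""
    p = percentile(paper_citations, reference_citations)
    for bound, label in [(99, "top 1%"), (90, "90-99%"),
                         (75, "75-90%"), (50, "50-75%")]:
        if p >= bound:
            return label
    return "bottom 50%"
```

For example, a paper with 10 citations in a journal whose same-year papers average 5 citations has a JNCS of 2.0, i.e. it is cited twice as often as expected for its journal and year.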