Exploring alternative metrics of scholarly performance in the social sciences and humanities in Taiwan
Research output and impact metrics derived from commercial citation databases such as Web of Science and Scopus have become the de facto indicators of scholarly performance across disciplines and regions. However, it has been pointed out that the existing metrics are largely inadequate for reflecting scholars’ overall peer-mediated performance, especially in the social sciences and humanities (SSH), where publication channels are more diverse. This paper proposes alternative metrics covering a variety of formal and informal communication channels, with the aim of better reflecting SSH scholarship. Data on these metrics were collected for a group of SSH scholars in Taiwan. Principal component analysis revealed four underlying dimensions represented by the 18 metrics. Multiple regression analyses were then performed to examine how well each of these dimensions predicted the academic standing of the scholars, measured by the number of public grants awarded and prestigious research awards received. The significance of the predictors differed between the social sciences and the humanities. The results suggest the need to consider disciplinary differences when evaluating scholarly performance.
Keywords: Research evaluation, Bibliometrics, Evaluation metrics, Altmetrics
The study was sponsored by “The Aim for the Top University Project, Integrated Platform of Digital Humanities” at National Taiwan University in Taiwan.
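To make the analysis pipeline concrete, the following is a minimal sketch, not the authors' code, of the PCA-plus-regression workflow described in the abstract. It uses NumPy and scikit-learn on synthetic placeholder data; the sample size, metric values, and outcome variable are all assumptions for illustration, with only the 18 metrics and four extracted dimensions taken from the paper.

```python
# Hedged sketch of the abstract's workflow: PCA on scholar-level metrics,
# then multiple regression of academic standing on the component scores.
# All data below are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_scholars, n_metrics = 120, 18                 # hypothetical sample; 18 metrics as in the paper
X = rng.normal(size=(n_scholars, n_metrics))    # placeholder metric values per scholar
grants = rng.poisson(3, size=n_scholars)        # placeholder outcome: grants awarded

# Standardize the metrics so PCA is not dominated by scale differences.
X_std = StandardScaler().fit_transform(X)

# Extract four components, matching the four dimensions the paper reports.
pca = PCA(n_components=4)
scores = pca.fit_transform(X_std)
print("Variance explained per dimension:", pca.explained_variance_ratio_)

# Regress academic standing on the four dimension scores.
model = LinearRegression().fit(scores, grants)
print("Coefficient per dimension:", model.coef_)
```

In the study itself, separate regressions with grants and prestigious research awards as outcomes, run for social-science and humanities scholars respectively, would take the place of the single synthetic outcome used here.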