Volume 102, Issue 1, pp 97–112

Exploring alternative metrics of scholarly performance in the social sciences and humanities in Taiwan

  • Kuang-hua Chen
  • Muh-chyun Tang
  • Chun-mei Wang
  • Jieh Hsiang


Research output and impact metrics derived from commercial citation databases such as Web of Science and Scopus have become the de facto indicators of scholarly performance across disciplines and regions. However, existing metrics have been shown to be largely inadequate for capturing scholars’ overall peer-mediated performance, especially in the social sciences and humanities (SSH), where publication channels are more diverse. In this paper, alternative metrics drawing on a variety of formal and informal communication channels are proposed, with the aim of better reflecting SSH scholarship. Data on these metrics were collected for a group of SSH scholars in Taiwan. Principal component analysis revealed four underlying dimensions among the 18 metrics. Multiple-regression analyses were then performed to examine how well each of these dimensions predicted the scholars’ academic standing, measured by the number of public grants awarded and prestigious research awards received. The significant predictors differed between the social sciences and the humanities, suggesting the need to account for disciplinary differences when evaluating scholarly performance.
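The analysis pipeline the abstract describes — reduce 18 performance metrics to a few latent dimensions with principal component analysis, then regress a measure of academic standing on the component scores — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors’ actual analysis: the sample size, the metric values, and the grant-count outcome are all hypothetical, and the PCA is computed directly via singular value decomposition of the standardized data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scholars, n_metrics = 100, 18          # hypothetical sample; 18 metrics as in the paper
X = rng.normal(size=(n_scholars, n_metrics))   # synthetic metric values

# Standardize each metric, then obtain principal components via SVD.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

# Keep the first four components (the four dimensions found in the study)
# and compute each scholar's score on them.
k = 4
scores = Xs @ Vt[:k].T                   # shape: (n_scholars, 4)

# Regress a hypothetical outcome (e.g., number of grants awarded) on the
# four component scores by ordinary least squares with an intercept.
y = rng.poisson(3, size=n_scholars)      # synthetic grant counts
A = np.column_stack([np.ones(n_scholars), scores])
beta, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
```

Using component scores rather than the raw 18 metrics as regression predictors avoids the multicollinearity that strongly correlated bibliometric indicators would otherwise introduce.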


Keywords: Research evaluation; Bibliometrics; Evaluation metrics; Altmetrics



The study was sponsored by “The Aim for the Top University Project, Integrated Platform of Digital Humanities” at National Taiwan University in Taiwan.



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2014

Authors and Affiliations

  • Kuang-hua Chen (1)
  • Muh-chyun Tang (1)
  • Chun-mei Wang (1)
  • Jieh Hsiang (2)

  1. Department of Library and Information Science, National Taiwan University, Taipei, Taiwan, R.O.C.
  2. Department of Computer Science and Information Engineering, National Taiwan University, Taipei, Taiwan
