
Exploring alternative metrics of scholarly performance in the social sciences and humanities in Taiwan

Published in Scientometrics
Abstract

Research output and impact metrics derived from commercial citation databases such as Web of Science and Scopus have become the de facto indicators of scholarly performance across disciplines and regions. However, existing metrics have been shown to be largely inadequate for reflecting scholars’ overall peer-mediated performance, especially in the social sciences and humanities (SSH), where publication channels are more diverse. In this paper, alternative metrics drawing on a variety of formal and informal communication channels are proposed, with the aim of better reflecting SSH scholarship. Data on these 18 metrics were collected for a group of SSH scholars in Taiwan. Principal component analysis revealed four underlying dimensions among the metrics. Multiple-regression analyses were then performed to examine how well each of these dimensions predicted the scholars’ academic standing, measured by the number of public grants awarded and prestigious research awards received. The significance of the predictors differed between the social sciences and the humanities. The results suggest the need to consider disciplinary differences when evaluating scholarly performance.
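The analysis pipeline described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the data are synthetic, and the 18 metrics, 4 components, and grant-count outcome merely mirror the quantities named in the abstract.

```python
# Hedged sketch of the abstract's pipeline: standardize 18 metrics,
# extract 4 principal components, then regress an academic-standing
# outcome (e.g., grants awarded) on the component scores.
# All data below are synthetic; shapes follow the abstract's description.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_scholars, n_metrics = 120, 18

# Synthetic metric matrix: one row per scholar, one column per metric.
X = rng.normal(size=(n_scholars, n_metrics))

# Standardize the metrics, then reduce to four underlying dimensions.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=4)
scores = pca.fit_transform(X_std)  # shape: (n_scholars, 4)

# Synthetic outcome standing in for grants/awards received.
y = scores @ np.array([1.0, 0.5, 0.2, 0.0])
y += rng.normal(scale=0.5, size=n_scholars)

# Regress the outcome on the component scores to see which
# dimensions act as significant predictors.
reg = LinearRegression().fit(scores, y)
print(scores.shape)  # (120, 4)
print(reg.coef_.shape)  # (4,)
```

In practice one would inspect the PCA loadings to interpret each dimension and report per-coefficient significance tests (e.g., via statsmodels OLS) rather than only the fitted coefficients, as the study compares predictor significance between the social sciences and the humanities.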



Acknowledgments

The study was sponsored by “The Aim for the Top University Project, Integrated Platform of Digital Humanities” at National Taiwan University in Taiwan.

Author information

Correspondence to Muh-chyun Tang.

About this article

Cite this article

Chen, K.-h., Tang, M.-c., Wang, C.-m. et al. Exploring alternative metrics of scholarly performance in the social sciences and humanities in Taiwan. Scientometrics 102, 97–112 (2015). https://doi.org/10.1007/s11192-014-1420-6
