Scientometrics, Volume 101, Issue 2, pp 1195–1213

Which role do non-source items play in the social sciences? A case study in political science in Germany


Abstract

Publications that are not indexed by citation indices such as Web of Science (WoS) or Scopus are called “non-source items” and have so far been neglected by most bibliometric analyses. The central issue of this study is to investigate the characteristics of non-source items and the effect of their inclusion in bibliometric evaluations in the social sciences, specifically in German political science. The results show that including non-source items substantially increases the number of publications (+1,350 %) and, to a lesser extent, the number of citations from the SCIE, SSCI, and A&HCI (+150 %) for the evaluated political scientists. 42 % of non-source items are published as book chapters. Edited books and books are the most-cited non-source items. About 40 % of non-source items are in English, compared with 80 % of source items. Citation rates that take non-source items into account are lower than those based on source items alone, partly as a result of the limited coverage of WoS. In contrast, H-indices based only on non-source items are higher than those based on source items. In short, the results of this study show that non-source items should be included in bibliometric evaluations, regardless of their impact or the citations they receive. They also demonstrate the need for more comprehensive coverage of bibliometric databases in the social sciences to enable higher-quality evaluations.
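The H-index comparison above follows Hirsch's standard definition: the largest h such that an author has at least h publications cited at least h times each. A minimal sketch (the sample citation counts are hypothetical, chosen only to illustrate how adding non-source items can raise the index even when they are cited less on average):

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h
    publications have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this publication still clears the threshold
        else:
            break
    return h

# Hypothetical example: four source items vs. the same items
# plus four moderately cited non-source items.
source_only = [10, 8, 5, 2]              # h-index = 3
with_non_source = source_only + [4, 4, 3, 1]  # h-index = 4
```

Because the h-index depends on the number of publications clearing the rank threshold, not on average citation rates, a broader publication base from non-source items can raise it even while lowering citations per item.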

Keywords

Non-source items · Social sciences · Publication patterns · Research evaluation · Political science

Mathematics Subject Classification

01 · 94

JEL Classification

A14 

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2014

Authors and Affiliations

Institute for Research Information and Quality Assurance, Berlin, Germany
