Abstract

This chapter reviews bibliometric studies of the social sciences and humanities. Its premise is that quantitative evaluation of research output faces severe methodological difficulties in fields whose literature differs in nature from scientific literature. SSCI bibliometrics works reasonably well in economics and psychology, whose literatures share many characteristics with science, and less well in sociology, which is characterised by a typical social science literature. Bibliometric evaluations are based on the international journal literature indexed in the SSCI, but social scientists also publish books, write for national journals, and write for the non-scholarly press. These literatures form distinct yet partially overlapping worlds, each serving a different purpose: national journals, for example, communicate with a local scholarly community, while the non-scholarly press represents research in interaction with contexts of application. Each literature is also more trans-disciplinary than its scientific counterpart, which itself poses methodological challenges. The chapter explores the nature and role of each of these literatures and argues that, by ignoring the three other literatures of social science, bibliometric evaluation produces a distorted picture of social science fields.

Keywords

Social science; journal article; research output; citation rate; bibliometric indicator

Copyright information

© Kluwer Academic Publishers 2004

Authors and Affiliations

Diana Hicks, School of Public Policy, Georgia Institute of Technology, USA