There is considerable interest in the ranking of journals, given the intense pressure to place articles in the “top” journals. In this article, a new index, h, and a new source of data, Google Scholar, are introduced, and a number of advantages of this methodology for assessing journals are noted. This approach is attractive because it provides a more robust account of the scholarly enterprise than do the standard Journal Citation Reports. Readily available software enables do-it-yourself assessments of journals, including those not otherwise covered, and enables the journal selection process to become a research endeavor that identifies particular articles of interest. While some critics are skeptical about the visibility and impact of sociological research, the evidence presented here indicates that most sociology journals produce a steady stream of papers that garner considerable attention. While the position of individual journals varies across measures, there is a high degree of commonality across these measurement approaches. A clear hierarchy of journals remains no matter what assessment metric is used. Moreover, data over time indicate that the hierarchy of journals is highly stable and self-perpetuating. Yet highly visible articles do appear in journals outside the set of elite journals. In short, the h index provides a more comprehensive picture of the output and noteworthy consequences of sociology journals than do standard impact scores, even though the overall ranking of journals does not markedly change.
The use of citation counts in evaluations remains controversial, whether it is done directly or via journal rankings as a proxy (van Raan 1996; MacRoberts and MacRoberts 1996; Seglen 1997; Garfield 2006; see Holden et al. 2006 for a number of recent references). In an appendix to this report, I discuss a key issue in the use of individual citations at the tenure decision. The basic problem, at least in the social sciences, is that the impact of research papers cannot be fully assessed until well after the tenure decision needs to be made.
The mean exposure time in the standard impact score is one year. For example, the 2008 impact score for a journal is based on citations to papers published in 2006 and 2007. The papers published at the beginning of 2006 thus have almost two years to garner references, but those published at the end of 2007 have only a few months. Similarly, the five-year impact score discussed below has a mean exposure time of 2.5 years, and thus does not capture five full years of citation exposure.
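The arithmetic behind the standard two-year impact score can be sketched as follows; the journal, paper counts, and citation counts below are invented for illustration only.

```python
# A minimal sketch of the standard two-year impact score. All numbers
# here are hypothetical, not drawn from any actual journal.
def impact_factor(citations, papers):
    """Citations received during year Y to papers published in years
    Y-1 and Y-2, divided by the number of those papers."""
    return citations / papers

# Suppose a journal published 50 papers in 2006 and 60 in 2007, and those
# 110 papers were cited 220 times during 2008.
score_2008 = impact_factor(citations=220, papers=50 + 60)
print(score_2008)  # 2.0
```

Note that the score averages over papers with very different exposure times, which is exactly the issue raised above: a paper from January 2006 and one from December 2007 contribute equally to the denominator.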
Scopus is yet another potential data source for journal comparisons (Leydesdorff, Moya-Anegon and Guerrero-Bote, 2010). I prefer Google Scholar because of its inclusion of references in books, and because it covers materials published over a longer time frame.
Unfortunately, PoP is not well suited for estimating the proportion of papers rarely if ever cited. That is because it often includes a number of variant references or citations, which generates a “tail” of entries with zero, one or two citations.
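The problem of variant entries can be illustrated with a crude merging sketch. PoP itself does not expose such a step; the normalization rule and the entries below are hypothetical, meant only to show how stray variants inflate the low-citation tail.

```python
import re
from collections import defaultdict

def normalize(title):
    """Crude normalization: lowercase, collapse punctuation and whitespace.
    Real variant records differ in subtler ways (author order, year, venue)."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def merge_variants(entries):
    """Merge (title, citation_count) entries whose normalized titles match,
    summing their citation counts."""
    merged = defaultdict(int)
    for title, count in entries:
        merged[normalize(title)] += count
    return dict(merged)

# Invented example: one well-cited paper plus a stray variant record that
# would otherwise appear as a separate, rarely cited entry in the tail.
entries = [
    ("The Matthew Effect in Science", 120),
    ("The Matthew effect in science.", 2),
    ("Some Other Paper", 15),
]
print(merge_variants(entries))
```

Without such merging, the two-citation variant would be counted as a separate, nearly uncited paper, which is why estimates of the proportion of uncited papers from raw PoP output are unreliable.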
The International Review of Sociology has been published since 1893, two years before the American Journal of Sociology.
The Du Bois Review has only been published since 2004; it has achieved an h score of 11 over a six-year period.
It should be noted that the average “exposure” time for a paper to be cited was five years, since the papers were published throughout the ten-year period covered. The most cited papers are concentrated among those published in the earliest years of the decade because they had the most time to be read and absorbed.
Two journals, the Annals of Tourism Research and the Cornell Hospitality Quarterly, were removed on substantive grounds.
Another way to view this association recognizes that the five-year impact score includes the two-year score. It may be useful to examine the relationship between the first two years of citations and the subsequent three years of citations. This involves subtracting the two-year impact score from the five-year impact score and correlating the two-year score with the remainder. This association is weaker but still substantial (r = .80).
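The subtraction-and-correlation step described above can be sketched as follows; the impact scores are invented, so the resulting r is illustrative, not the r = .80 reported in the text.

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores for five journals.
two_year = [1.2, 0.8, 2.5, 0.4, 1.9]   # two-year impact scores
five_year = [2.9, 1.6, 5.1, 1.1, 4.0]  # five-year impact scores

# Citations accruing in years 3-5 = five-year score minus two-year score.
residual = [f - t for f, t in zip(five_year, two_year)]
print(round(pearson_r(two_year, residual), 2))
```

Correlating the two-year score with the residual, rather than with the full five-year score, avoids the built-in overlap between the two measures.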
François Nielsen, former editor of Social Forces, notes that Social Forces ranks higher on the eigenfactor metric. This measure weights citations by “quality,” i.e., the ranking of the citing journal. This type of adjustment would be difficult to implement with Google Scholar, since one would have to weight not just journals but citations appearing in books and other sources as well.
The g-index is the (unique) largest number such that the top g articles received (together) at least g² citations.
In terms of data errors, h is somewhat less vulnerable than g to incorrect and variant citations. Each such error affects g, whereas h depends only on the accuracy of the citation counts of papers close to the value of h. In other words, errors in the citation counts of very highly cited and very rarely cited papers will not affect the measured value of h.
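The two definitions above can be made concrete with a short sketch; the citation counts below are invented for illustration.

```python
def h_index(citations):
    """h is the largest h such that h papers have at least h citations each."""
    cs = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cs, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """g is the largest g such that the top g papers together
    received at least g**2 citations."""
    cs = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cs, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical citation counts for ten papers from one journal.
cites = [25, 19, 12, 7, 4, 4, 2, 1, 0, 0]
print(h_index(cites), g_index(cites))  # 4 8
```

Note how g exceeds h here because the two most cited papers pull the cumulative total up, while h ignores everything above the threshold — the robustness property described above.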
It should also be noted that the statistics reported here do not adjust for the number of articles published by each journal.
The ratio of h for the top 20 journals versus the remaining 108 published during both periods declines from 2.7 to 2.4, but this difference is too small to be statistically significant (treating this set of journals as a statistical sample).
An entry for “Reflexive Modernization” by Ulrich Beck, Anthony Giddens and Scott Lash in the journal Theory, Culture & Society garnered 783 citations. This reference, however, is to a special issue of the journal rather than to a single article.
An earlier draft of this paper cited an essay by Samuel Bowles and Herbert Gintis entitled “Schooling in Capitalist America Revisited” as the most frequently cited paper. Unfortunately, the references to this article, published in the journal Sociology of Education, appear to be conflated with references to the book with the same title published by these authors a quarter of a century earlier.
Ronald Inglehart is a political scientist by training, but his research on “post-materialist” values is quite prominent in sociology. Gautam Ahuja is a management professor; his highly cited paper seeks to build on the research of Ronald Burt, a noted sociologist of networks. Perhaps the paper that “sticks out” the most is the one by Filmer and Pritchett on wealth effects in the journal Demography. This paper examines the impact of household wealth on schooling in India. While this topic is in principle of interest to sociologists, the article has been of greater interest to scholars in other fields. Based on the ISI classification of the citing journals, the Filmer and Pritchett paper is most popular in public health, tropical medicine, economics and demography, with only 2% of the citations appearing in sociology journals.
Adkins, D., & Budd, J. (2006). Scholarly productivity of U.S. LIS faculty. Library & Information Science Research, 28(3), 374–389.
Allen, M. P. (2003). The “core influence” of journals in sociology revisited. Footnotes (American Sociological Association newsletter), December. http://www.asanet.org/footnotes/dec03/fn11.html
Borgman, C. L., & Furner, J. (2002). Scholarly communication and bibliometrics. Annual Review of Information Science and Technology, 36, 3–72.
Bornmann, L., & Daniel, H.-D. (2007). What do we know about the h index? Journal of the American Society for Information Science and Technology, 58(9), 1381–1385.
Clemens, E. S., Powell, W. W., McIlwaine, K., & Okamoto, D. (1995). Careers in print: books, journals, and scholarly reputations. American Journal of Sociology, 101(2), 433–494.
Cronin, B., Snyder, H., & Atkins, H. (1997). Comparative citation rankings of authors in monographic and journal literature: a study of sociology. Journal of Documentation, 53(3), 263–273.
Duncan, O. D. (1961). A socioeconomic index for all occupations. In A. J. Reiss (Ed.), Occupations and social status (pp. 109–138). New York: Free Press.
Egghe, L. (2006). Theory and practice of the g-index. Scientometrics, 69(1), 131–152.
Espeland, W., & Sauder, M. (2016). By the numbers: How media rankings changed legal education in America. New York: Russell Sage Foundation.
Frodeman, R. (2010). Introduction. In R. Frodeman (Ed.), Oxford handbook of interdisciplinarity. Oxford: Oxford University Press.
Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA (Journal of the American Medical Association), 295(1), 90–93 (January 4).
Giles, M. W., & Garand, J. C. (2007). Ranking political science journals: reputational and citational approaches. Political Science & Politics, 40(4), 741–751.
Google Scholar. (2015). http://scholar.google.com/.
Harzing, A. W. (2011). The publish or perish book: Your guide to effective and responsible citation analysis. Melbourne: Tarma Software Research. Available online.
Harzing, A. W. (2015). Publish or perish, Version 4, available at www.harzing.com/pop.htm.
Harzing, A.-W., & van der Wal, R. (2009). A Google Scholar h-index for journals: an alternative metric to measure journal impact in economics and business. Journal of the American Society for Information Science & Technology, 60(1), 41–46.
Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.
Holden, G., Rosenberg, G., Barker, K., & Onghena, P. (2006). An assessment of the predictive validity of impact factor scores: implications for academic employment decisions in social work. Research on Social Work Practice, 16(6), 613–624.
ISI Web of Science. (2015). www.isiknowledge.com/.
Jacobs, J. A. (2005). ASR’s greatest hits. American Sociological Review, 70(1), 1–4.
Jacobs, J. A. (2007). Further reflections on ASR’s greatest hits. The American Sociologist, 38(1), 99–131. Also available on the American Sociological Review webpage, http://www.asanet.org/journals/asr/2005/043sup1.pdf.
Jacobs, J. A. (2009). Where credit is due: assessing the visibility of articles published in gender & society with Google Scholar. Gender & Society, 23(6), 817–832.
Jacobs, J. A. (2013). In defense of disciplines: Interdisciplinarity and specialization in the research university. Chicago: University of Chicago Press.
Jacobs, J. A., & Frickel, S. (2009). Interdisciplinarity: a critical assessment. Annual Review of Sociology, 35, 43–66. http://arjournals.annualreviews.org/eprint/yyThexc9vmVNN4DkFjKC/full/10.1146/annurev-soc-070308-115954.
Kalaitzidakis, P., Mamuneas, T. P., & Stengos, T. (2003). Rankings of academic journals and institutions in economics. Journal of the European Economic Association, 1(6), 1346–1366.
Larivière, V., Gingras, Y., & Archambault, É. (2009). The decline in the concentration of citations, 1900–2007. Journal of the American Society for Information Science and Technology, 60(4), 858–862.
Leydesdorff, L. (2009). How are new citation-based journal indicators adding to the bibliometric toolbox? Journal of the American Society for Information Science & Technology, 60(7), 1327–1336.
Lluch, J. O. (2005). Some considerations on the use of the impact factor of scientific journals as a tool to valuate research in psychology. Scientometrics, 65(2), 189–197.
Luzer, D. (2013). No one really reads academic papers. Washington Monthly, February 19. http://www.washingtonmonthly.com/college_guide/blog/academics_do_a_lot_of.php
MacRoberts, M. H., & MacRoberts, B. R. (1996). Problems of citation analysis. Scientometrics, 36(3), 435–444.
Merton, R. K. (1968). The Matthew effect in science. Science, 159(3810), 56–63 (January 5). Reprinted in R. K. Merton, The sociology of science: Theoretical and empirical investigations. Chicago: University of Chicago Press, 1973.
Mingers, J., & Harzing, A.-W. (2007). Ranking journals in business and management: a statistical analysis of the Harzing data set. European Journal of Information Systems, 16(4), 303–316.
Moed, H. F. (2005). Citation analysis of scientific journals and journal impact measures. Current Science, 89(12), 1990–1996 (December 25).
Moed, H. F., & Van Leeuwen, T. N. (1995). Improving the accuracy of institute for scientific information’s journal impact factors. Journal of the American Society for Information Science, 46(6), 461–467.
Seglen, P. O. (1997). Why the impact factor of journals should not be used for evaluating research. British Medical Journal, 314(February 15), 498–502.
Shapiro, F. R. (2000). The most-cited law reviews. Journal of Legal Studies, 29, 1540–1554.
van Raan, A. F. J. (1996). Advanced bibliometric methods as quantitative core of peer-review based evaluation and foresight exercises. Scientometrics, 36, 397–420.
van Raan, A. F. J. (2005). Fatal attraction: conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
van Raan, A. F. J. (2006). Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups. Scientometrics, 67(3), 491–502.
Jacobs, J.A. Journal Rankings in Sociology: Using the H Index with Google Scholar. Am Soc 47, 192–224 (2016). https://doi.org/10.1007/s12108-015-9292-7
- Sociology journals
- Journal rankings
- H index
- Google Scholar