
Frame Search and Re-search: How Quantitative Sociological Articles Change During Peer Review


Abstract

Peer review is a central institution in academic publishing, yet its processes and effects on research remain opaque. Empirical studies have (1) been rare because data on the peer review process are generally unavailable, and (2) conceptualized peer reviewers as gate-keepers who either accept or reject a manuscript, overlooking peer review’s role in constructing articles. This study uses a unique data resource to study how sociological manuscripts change during peer review. Authors of published sociological research often present earlier versions of that research at annual meetings of the American Sociological Association (ASA). Many of these annual meeting papers are publicly available online and tend to be uploaded before undergoing formal peer review. A data sample is constructed by linking these papers to the respective versions published between 2006 and 2012 in two peer-reviewed journals, American Sociological Review and Social Forces. Quantitative and qualitative analyses examine changes across article versions, paying special attention to how elements of data analysis and theory in the ASA versions change. Results show that manuscripts tend to change more substantially in their theoretical framing than in their data analyses. This finding suggests that a chief effect of peer review in quantitative sociology is to prompt authors to adjust their theoretical framing, a mode of review I call “data-driven.” The data-driven mode of review problematizes the vision of sociological research as addressing theoretically motivated questions.


Notes

  1. The literature on the shortcomings of peer review is voluminous. See (Bornmann 2011; Lamont 2009; Lee et al. 2013) for recent reviews.

  2. Recently a literature has started to coalesce around research practices, including evaluation, in the social sciences. See Camic, Gross, and Lamont 2011; Lamont 2009; Leahey 2008 for reviews and examples. Leahey (2008: 45) identifies the processes of framing papers and the receptiveness of authors to criticisms – the subjects of this study – as particularly ripe for investigation.

  3. Later sections of this document will use the concept of a “theoretical frame,” rather than “theory.” The distinction is developed later in the introduction.

  4. One of the limitations of this study, elaborated later, is that it does not examine manuscripts that were ultimately rejected but only those that were ultimately published.

  5. To be more precise, this study assumes that the changes provoked by journal peer review do not differ systematically from changes effected by other forces, e.g. informal comments in workshops and from colleagues.

  6. This approximation is further justified in the “Analytic strategy” section.

  7. Note, for example, that many ASA sections are topical. The discipline’s bibliometric structure also appears problem-centered (Moody and Light 2006).

  8. Some articles, especially the ASA versions, lacked one or more of the following sections. Other articles, especially the published versions, included separate discussion and conclusion sections or an additional section providing context (e.g. historical). Nevertheless, all articles largely followed this five-section structure.

  9. I used the Natural Language Toolkit (NLTK), version 2.0, available at http://www.nltk.org/.

  10. It is common in natural language processing to remove common words, often called stopwords, such as “the,” “was,” and the like.

  11. I also compared the sentences in an ASA article section to sentences only in the corresponding section of the published article, instead of its entirety. The results were qualitatively similar. A sketch of the text-processing steps described in notes 9–11 appears after these notes.

  12. The analysis of changes in variables and references uses a smaller number of article pairs (23) than the analysis of textual similarity (30) because deep-reading the larger number of pairs was impractical. The article pairs chosen for the more fine-grained analysis were those published most recently.

  13. The emphasis on data source is designed to capture the intuition that when an author draws on a large dataset, e.g., the Current Population Survey (CPS), and uses some of its data in one version of a paper but other data in a different version, the data have not substantively changed, because the author was likely aware of all of the data within the CPS the entire time.

  14. In one article pair, a data source was used in the ASA version but not in the final version of the paper.

  15. The distinction between data source and variable is not clear when the variable is used only descriptively (i.e. not in a regression equation). The number of such cases in the corpus is relatively small.

  16. The authors indicated that they did not want the draft cited or circulated without permission.

  17. Independent-samples t-test: t = 3.95, p < 0.001. A sketch of this calculation appears after these notes.

  18. For example, Fox (1989: 189) suggests that the disunity of sociology will push reviewers to comment on those matters over which there is consensus, namely methods and data analysis.

  19. It should be noted that the fungibility of references or variables need not be taken as an exogenous fact; it may be taken, as it is in this study, as a phenomenon to be explained.
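
The text-processing steps described in notes 9–11 can be illustrated with a short script. The following is a minimal sketch using a current NLTK release, not the author’s exact pipeline; in particular, the Jaccard overlap used to score sentence pairs is an assumption for illustration, since these notes do not specify the similarity measure.

    # Illustrative sketch only (not the study's code): tokenize two article
    # versions with NLTK, drop stopwords (note 10), and score each sentence of
    # an ASA section against its best match anywhere in the published article
    # (note 11). Requires nltk.download('punkt') and nltk.download('stopwords').
    import nltk
    from nltk.corpus import stopwords

    STOPWORDS = set(stopwords.words('english'))

    def content_words(sentence):
        # Lowercased word tokens with punctuation and stopwords removed.
        tokens = nltk.word_tokenize(sentence.lower())
        return {t for t in tokens if t.isalpha() and t not in STOPWORDS}

    def jaccard(a, b):
        # Overlap of two token sets; 0.0 when both sets are empty.
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    def section_similarity(asa_section_text, published_text):
        # Average best-match similarity of ASA sentences to the published article.
        asa_sents = [content_words(s) for s in nltk.sent_tokenize(asa_section_text)]
        pub_sents = [content_words(s) for s in nltk.sent_tokenize(published_text)]
        scores = [max((jaccard(a, p) for p in pub_sents), default=0.0)
                  for a in asa_sents]
        return sum(scores) / len(scores) if scores else 0.0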

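The independent-samples t-test reported in note 17 can be reproduced with SciPy; the following is a minimal sketch in which the similarity scores are placeholders for illustration only, not the study’s data.

    # Illustrative sketch only: compares two hypothetical sets of similarity
    # scores with an independent-samples t-test, as in note 17.
    from scipy import stats

    framing_scores = [0.21, 0.35, 0.28, 0.40, 0.31]   # hypothetical values
    analysis_scores = [0.62, 0.55, 0.70, 0.48, 0.66]  # hypothetical values

    t_stat, p_value = stats.ttest_ind(framing_scores, analysis_scores)
    print("t = %.2f, p = %.4f" % (t_stat, p_value))
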
References

  • Abbott, A. (1997). Seven types of ambiguity. Theory and Society, 26(2/3), 357–91.
  • Abbott, A. (2004). Methods of discovery: Heuristics for the social sciences. New York: W. W. Norton & Company.
  • Abbott, A. (2014). Digital paper: A manual for research and writing with library and internet materials. Chicago: University of Chicago Press.
  • Abend, G. (2006). Styles of sociological thought: sociologies, epistemologies, and the Mexican and U.S. quests for truth. Sociological Theory, 24(1), 1–41.
  • Abend, G. (2008). The meaning of ‘theory’. Sociological Theory, 26(2), 173–99.
  • Bakanic, V., McPhail, C., & Simon, R. J. (1987). The manuscript review and decision-making process. American Sociological Review, 52(5), 631–42.
  • Bakanic, V., McPhail, C., & Simon, R. J. (1989). Mixed messages: referees’ comments on the manuscripts they review. The Sociological Quarterly, 30(4), 639–54.
  • Bazerman, C. (1988). Shaping written knowledge: The genre and activity of the experimental article in science (1st ed.). Madison: University of Wisconsin Press.
  • Bornmann, L. (2011). Scientific peer review. Annual Review of Information Science and Technology, 45(1), 197–245.
  • Bornmann, L., & Daniel, H.-D. (2005). Selection of research fellowship recipients by committee peer review: reliability, fairness and predictive validity of board of trustees’ decisions. Scientometrics, 63(2), 297–320.
  • Camic, C., Gross, N., & Lamont, M. (2011). Social knowledge in the making (1st ed.). Chicago: University of Chicago Press.
  • Cole, S. (2001). What’s wrong with sociology? New Brunswick: Transaction Publishers.
  • Cole, S., Cole, J. R., & Simon, G. A. (1981). Chance and consensus in peer review. Science, 214(4523), 881–86.
  • Dear, P. (1985). Totius in verba: rhetoric and authority in the early Royal Society. Isis, 76(2), 145–61.
  • Espeland, W. N., & Sauder, M. (2007). Rankings and reactivity: how public measures recreate social worlds. American Journal of Sociology, 113(1), 1–40.
  • Fox, M. F. (1989). Disciplinary fragmentation, peer review, and the publication process. The American Sociologist, 20(2), 188–91.
  • Gilbert, G. N. (1976). The transformation of research findings into scientific knowledge. Social Studies of Science, 6(3/4), 281–306.
  • Gilbert, G. N. (1977). Referencing as persuasion. Social Studies of Science, 7(1), 113–22.
  • Gilbert, G. N., & Mulkay, M. (1984). Opening Pandora’s box: A sociological analysis of scientists’ discourse. Cambridge: Cambridge University Press.
  • Glaser, B., & Strauss, A. (1999). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Transaction.
  • Goodman, S. N., Berlin, J., Fletcher, S. W., & Fletcher, R. H. (1994). Manuscript quality before and after peer review and editing at Annals of Internal Medicine. Annals of Internal Medicine, 121(1), 11–21.
  • Gross, A. G. (1990a). The rhetoric of science. Cambridge: Harvard University Press.
  • Gross, A. G. (1990b). Persuasion and peer review in science: Habermas’s ideal speech situation applied. History of the Human Sciences, 3(2), 195–209.
  • Guetzkow, J., Lamont, M., & Mallard, G. (2004). What is originality in the humanities and the social sciences? American Sociological Review, 69(2), 190–212.
  • Gusfield, J. (1976). The literary rhetoric of science: comedy and pathos in drinking driver research. American Sociological Review, 41(1), 16–34.
  • Hargens, L. L. (1988). Scholarly consensus and journal rejection rates. American Sociological Review, 53(1), 139–51.
  • Harnad, S. (2000). The invisible hand of peer review. Exploit Interactive. Retrieved September 8, 2014 (http://cogprints.org/1646/).
  • Hirschauer, S. (2010). Editorial judgments: a praxeology of ‘voting’ in peer review. Social Studies of Science, 40(1), 71–103.
  • Huutoniemi, K. (2012). Communicating and compromising on disciplinary expertise in the peer review of research proposals. Social Studies of Science, doi:10.1177/0306312712458478.
  • Knorr, K. D. (1977). Producing and reproducing knowledge: descriptive or constructive? Toward a model of research production. Social Science Information, 16(6), 669–96.
  • Knorr, K. D. (1979). Tinkering toward success: prelude to a theory of scientific practice. Theory and Society, 8(3), 347–76.
  • Knorr, K. D., & Knorr, D. (1978). On the relationship between laboratory research and published paper in science. Retrieved November 3, 2014 (https://www.ihs.ac.at/publications/ihsfo/fo132.pdf).
  • Knorr-Cetina, K. (1981). The manufacture of knowledge: An essay on the constructivist and contextual nature of science. Oxford: Pergamon Press.
  • Lamont, M. (1987). How to become a dominant French philosopher: the case of Jacques Derrida. American Journal of Sociology, 93(3), 584–622.
  • Lamont, M. (2009). How professors think: Inside the curious world of academic judgment. Cambridge: Harvard University Press.
  • Lamont, M. (2012). Toward a comparative sociology of valuation and evaluation. Annual Review of Sociology, 38(1), 201–221. doi:10.1146/annurev-soc-070308-120022.
  • Lamont, M., & Mallard, G. (2005). Peer review in international perspectives: US, UK and France. Report commissioned by the Social Sciences and Humanities Research Council of Canada.
  • Langfeldt, L. (2001). The decision-making constraints and processes of grant peer review, and their effects on the review outcome. Social Studies of Science, 31(6), 820–41.
  • Latour, B., & Woolgar, S. (1979). Laboratory life: The social construction of scientific facts. Beverly Hills: Sage Publications.
  • Leahey, E. (2008). Methodological memes and mores: toward a sociology of social research. Annual Review of Sociology, 34(1), 33–53.
  • Lee, C. J., Sugimoto, C. R., Guo, Z., & Cronin, B. (2013). Bias in peer review. Journal of the American Society for Information Science and Technology, 64(1), 2–17.
  • Lynch, M., & Bogen, D. (1997). Sociology’s asociological ‘core’: an examination of textbook sociology in light of the sociology of scientific knowledge. American Sociological Review, 62(3), 481–93.
  • McCloskey, D. N. (1998). The rhetoric of economics (2nd ed.). Madison: University of Wisconsin Press.
  • Medawar, P. (1964). Is the scientific paper fraudulent? Yes, it misrepresents scientific thought. Saturday Review, pp. 42–43.
  • Merton, R. K. (1968). The Matthew effect in science: the reward and communication system of science. Science, 199, 55–63.
  • Merton, R., & Zuckerman, H. (1971). Patterns of evaluation in science: institutionalisation, structure and functions of the referee system. Minerva, 9(1), 66–100.
  • Myers, G. (1985). Texts as knowledge claims: the social construction of two biology articles. Social Studies of Science, 15(4), 593–630.
  • Nelson, J. S. (1990). Rhetoric of the human sciences: Language and argument in scholarship and public affairs. Madison: University of Wisconsin Press.
  • Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia II: Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–31.
  • Pfeffer, J., Leong, A., & Strehl, K. (1977). Paradigm development and particularism: journal publication in three scientific disciplines. Social Forces, 55(4), 938–51.
  • Rueschemeyer, D. (2009). Usable theory: Analytic tools for social and political research (1st ed.). Princeton: Princeton University Press.
  • Shapin, S. (1984). Pump and circumstance: Robert Boyle’s literary technology. Social Studies of Science, 14(4), 481–520.
  • Simon, R. J., Bakanic, V., & McPhail, C. (1986). Who complains to journal editors and what happens. Sociological Inquiry, 56(2), 259–71.
  • Wennerås, C., & Wold, A. (1997). Nepotism and sexism in peer-review. Nature, 387(6631), 341–43.


Author information


Correspondence to Misha Teplitskiy.


Cite this article

Teplitskiy, M. Frame Search and Re-search: How Quantitative Sociological Articles Change During Peer Review. Am Soc 47, 264–288 (2016). https://doi.org/10.1007/s12108-015-9288-3
