The Scientific Research Output of U.S. Research Universities, 1980–2010: Continuing Dispersion, Increasing Concentration, or Stable Inequality?

Abstract

Extending Geiger and Feller’s (1995) analysis of increasing dispersion in R&D expenditures during the 1980s, this paper analyzes publication and citation counts, as well as R&D expenditures, for 194 top producers using Web of Science data. We find high and stable levels of inequality in the 1990s and 2000s, combined with robust growth both in the system and on individual campuses, considerable opportunities for short-range mobility, and very limited opportunities for long-range mobility. Initial investments in research, private control, and the capacity of wealthy institutions to attract productive faculty are associated with high levels of scientific output. New entrants to the system and those that leave it are both clustered near the bottom of the hierarchy.


Notes

  1.

    While they represent an outcome of increasing interest to students of scientific research (see, e.g., Owen-Smith 2003), patents and licenses are produced at a fractional rate as compared to publications and citations. We do not include them in this paper.

  2.

    The mergers included the Oregon Graduate Institute of Science and Technology into Oregon Health and Science University (OHSU) in 2001; the Medical College of Pennsylvania into MCP Hahnemann Medical College in 1993; MCP Hahnemann University with Drexel University College of Medicine in 2002; Hahnemann Medical School with Drexel University College of Medicine in 2003. The University of Maryland Baltimore Professional Schools publish as part of the University of Maryland, Baltimore. Finally, we were unable to ascertain the identity of the institution called Polytechnic University in G&F’s study.

  3.

    These 12 institutions included the University of Illinois at Urbana-Champaign, the University of Massachusetts at Amherst, Missouri University of Science and Technology, New Mexico State University, the University of Texas Health Science Center at San Antonio, the University of Texas Medical Branch at Galveston, the University of Texas at Dallas, and the University of Texas M.D. Anderson Cancer Center, the University of Maryland Center for Environmental Science, the University of Texas Health Science Center at Houston, the University of Puerto Rico Mayaguez, and the Uniformed Services University of the Health Sciences.

  4.

    Although the number of journals catalogued in WoS has grown over time, it is possible that researchers at lower-ranked institutions may find their research niches in more applied fields that are not included in WoS data.

  5.

    WoS is somewhat less useful for those social science disciplines, such as sociology and political science, in which book publication is important. It is least useful for the humanities in which book publishing is central to the establishment of authors’ and institutions’ reputations.

  6.

    Researchers who use fractional counting find that the number of papers per researcher is rising at a much lower rate. See, e.g., Fanelli (2010).
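The gap between whole and fractional counting can be illustrated with a minimal Python sketch (the papers and author names here are hypothetical, not the study's data):

```python
# Illustrative sketch of whole vs. fractional counting (hypothetical
# papers). Under fractional counting each paper contributes 1/n_authors
# to each author's total, so growing team sizes inflate whole counts
# much faster than fractional ones.
papers = [
    {"authors": ["A", "B"]},
    {"authors": ["A", "B", "C", "D"]},
    {"authors": ["A"]},
]

whole = {}
fractional = {}
for p in papers:
    n = len(p["authors"])
    for a in p["authors"]:
        whole[a] = whole.get(a, 0) + 1              # one full credit per paper
        fractional[a] = fractional.get(a, 0.0) + 1.0 / n  # 1/n credit per paper

print(whole["A"])       # 3 papers under whole counting
print(fractional["A"])  # 0.5 + 0.25 + 1.0 = 1.75 under fractional counting
```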

  7.

    We consider the output of the institution as a whole to be a better measure than per capita output because one advantage of larger institutions is precisely that they produce in a wide variety of areas. This increases their visibility and prominence. Per capita growth and mobility tables provide a slightly different picture of the trajectory of the system as a whole and of individual institutions. Medical and engineering institutions fare somewhat better in per capita analyses, for example. A few small campuses, such as Rockefeller University, are highly productive on a per capita basis but the small size of their faculty has led to declining rank in output measures over time. The results of per capita analyses do in other respects tend to closely mirror those of the whole institution analyses that we use in the paper. Per capita results are available on request.

  8.

    Inter-decile mobility can be measured in more than one way. An alternative measure, for example, would count any movement of 19 or more places as inter-decile mobility. In our view, however, such a measure fails to capture the concept accurately, because inter-decile movement signifies location in one decile at time 1 and location in another decile during a later period; the exact location within the decile is immaterial.
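The distinction can be made concrete with a small sketch (hypothetical ranks; the decile cutoff convention used here is an assumption for illustration, not necessarily the authors'):

```python
# Inter-decile mobility means sitting in different deciles at the two
# time points, regardless of how many rank places were traversed.
def decile(rank, n):
    """Decile (1 = top) of a 1-based rank among n institutions."""
    return (rank - 1) * 10 // n + 1

def inter_decile_mobile(rank_t1, rank_t2, n):
    """True when an institution's decile differs between the two periods."""
    return decile(rank_t1, n) != decile(rank_t2, n)

n = 194  # the number of top producers analyzed in the paper
print(inter_decile_mobile(20, 21, n))  # True: one place moved, but across a decile boundary
print(inter_decile_mobile(5, 15, n))   # False: ten places moved, all within the top decile
```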

  9.

    Although we expected high levels of collinearity between institutional R&D and R&D quartile at the beginning of the period, in fact the correlation between the two variables (r = 0.30) was not high enough to preclude the use of both. No variable in this study has a VIF over 2.5.
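As an illustration of this kind of collinearity screen, a minimal VIF computation might look as follows (synthetic data and a hand-rolled regression, not the study's variables or code):

```python
import numpy as np

# The variance inflation factor (VIF) for predictor j is 1 / (1 - R_j^2),
# where R_j^2 comes from regressing column j on the remaining predictors.
def vif(X):
    n, k = X.shape
    factors = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # intercept + other predictors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r_squared = 1.0 - resid.var() / y.var()
        factors.append(1.0 / (1.0 - r_squared))
    return np.array(factors)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.3 * x1 + rng.normal(size=200)  # mildly correlated predictors
X = np.column_stack([x1, x2])
print(vif(X))  # both values stay near 1, well under a 2.5 threshold
```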

  10.

    Because student subsidy is mathematically derived, there were 16 cases of negative subsidy, which we converted to 0, as no subsidy was offered at those institutions in those years.

  11.

    Growth should not be equated uncritically with proportionate increases in quality or significance of research. As universities and government agencies have begun to measure publication and citation outputs more regularly, pressures have increased to adopt distorting publication tactics, such as cutting up larger and higher quality papers into small text units, a practice known as “salami publishing” or “least publishable units” (Fanelli 2010), as well as text recycling in multiple publications, also known as “self-plagiarism” (Necker 2014). On competitive pressures as a source of decline in scholarly reading practices, see also Abbott (2016). Similarly, it is possible to manipulate citations through “citation rings” in which inter-connected individuals make tacit agreements to boost each other’s careers through co-citation. Publications and citations remain the best measures of scientific outputs, but these adaptations to competitive pressures should be kept in mind as partial explanations for output growth.

  12.

    We began this analysis in 1979–1980 and examined changes in R&D expenditures and publications in the end years of the following decades: 1989–1990, 1999–2000, and 2009–2010. For citations, we examined changes in 1989–1990 and 1999–2000 only, because publications from 2010 have had just six years to accumulate citations, as compared to publications from the earlier years.

  13.

    The gains in publications and citations also reflect the more than threefold growth of journals in the WoS database between 1972 and 2010 (Larsen and von Ins 2010). The growth in the number of journals catalogued is itself a function, in part, of a larger and more productive university labor force, capable of sustaining many more high-quality journals.

  14.

    We found a similar level of stability at the bottom of the hierarchy; approximately 20 universities consistently scored low across each of the measures and all four decades. These included several regional campuses (the University of South Alabama, the University of North Dakota-Grand Forks, the University of Alabama-Huntsville, and the University of North Texas), two California campuses more often thought of as teaching institutions (San Diego State University and San Jose State University), three former liberal arts colleges (the College of William and Mary, Ohio University, and Old Dominion University), several struggling science- and engineering-oriented universities (the Missouri University of Science and Technology, Tennessee Technological University, the State University of New York College of Environmental Science and Forestry, and the New Mexico Institute of Mining and Technology), and two minority-serving institutions (Florida A&M and the University of Puerto Rico).

  15.

    Upwardly-mobile campuses included Emory University (5th to 2nd decile in publications and citations), Arizona State University (5th to 3rd decile in publications; 6th to 4th decile in citations), the Georgia Institute of Technology (7th to 4th decile in publications and citations), and the University of South Florida (7th to 5th decile in publications and citations). In addition, the mobility opportunities of free-standing medical colleges were high during the study period, in which the budget of the National Institutes of Health was consistently three to five times larger than that of the National Science Foundation (AAAS 2016). Several of them, such as the University of Texas M.D. Anderson Cancer Center (6th to 3rd decile in publications; 5th to 2nd decile in citations), the Baylor College of Medicine (4th to 2nd decile in publications; 5th to 3rd decile in citations), and the Icahn School of Medicine at Mount Sinai (4th to 3rd decile in citations) were among those experiencing inter-decile upward mobility during the period. By contrast, the University of Oregon (4th to 8th decile in publications; 3rd to 7th decile in citations), Temple University (4th to 6th decile in publications and citations), Rockefeller University (6th to 8th decile in publications; 2nd to 5th decile in citations), Brandeis University (7th to 9th decile in publications; 5th to 8th decile in citations), and Howard University (8th to 10th decile in publications and citations) were among the institutions experiencing notable downward mobility during the period.

  16.

    A case can be made that continued dispersion would have encouraged a still more productive system, but the counterfactuals needed to prove such a case are missing. Dispersion leveled off after 1990, yet system productivity, as measured by publications and citations, nevertheless continued to grow robustly.

  17.

    For a penetrating analysis of these processes in one discipline, see Burris (2004).

References

  1. Abbott, Andrew. 2016. The demography of scholarly reading. American Sociologist 47: 302–318.

  2. Adams, James D., Grant C. Black, J. Roger Clemmons, and Paula E. Stephan. 2005. Scientific teams and institutional collaborations: Evidence for U.S. universities, 1981–1999. Research Policy 34: 259–285.

  3. Adams, James D., and Zvi Griliches. 1998. Research productivity in a system of universities. Annales d’Économie et de Statistique 49(50): 127–162.

  4. Allison, Paul D. 2009. Fixed effects regression models. Quantitative Applications in the Social Sciences 07-160. Los Angeles: SAGE Publications.

  5. American Institutes for Research (AIR). 2014. Delta Cost Project Database. Washington, DC: AIR.

  6. American Association of University Professors (AAUP). 2015. Busting the myths: The annual report on the economic status of the profession. https://www.aaup.org/reports-publications/2014-15salarysurvey/.

  7. American Association for the Advancement of Science (AAAS). 2016. Trends in Research by Agency, FY 1976–2016. Washington, DC: AAAS.

  8. Brint, Steven, Kristopher Proctor, Scott Patrick Murphy, and Robert A. Hanneman. 2012. The market model and the growth and decline of academic fields in U.S. four-year colleges and universities, 1980–2000. Sociological Forum 27: 275–299.

  9. Burris, Val. 2004. The academic caste system: Prestige hierarchies in Ph.D. exchange networks. American Sociological Review 69: 239–264.

  10. Charlton, Bruce G., and Peter Andras. 2007. Evaluating universities using simple scientometric research output metrics: Total citation counts per university for a retrospective seven-year rolling sample. Science and Public Policy 34: 555–563.

  11. Desilver, Drew. 2013. Global inequality: How the U.S. compares. Pew Research Center Fact Tank (December 19).

  12. Dundar, Halil, and Darrell R. Lewis. 1998. Determinants of research productivity in higher education. Research in Higher Education 39(6): 607–631.

  13. Fanelli, Daniele. 2010. Do pressures to publish increase scientists’ bias? An empirical support from U.S. states data. PLoS ONE 5(4): e10271. doi:10.1371/journal.pone.0010271. Accessed 04 July 2017.

  14. Geiger, Roger L., and Irwin Feller. 1995. The dispersion of academic research in the 1980s. Journal of Higher Education 66: 336–360.

  15. Geuna, Aldo, and Ben R. Martin. 2003. University research evaluation and funding: An international comparison. Minerva 41: 277–304.

  16. Gibbons, Michael, Camille Limoges, Helga Nowotny, Simon Schwartzman, Peter Scott, and Martin Trow. 1994. The new production of knowledge: The dynamics of science and research in contemporary societies. Thousand Oaks: Sage Publications.

  17. Halaby, Charles N. 2004. Panel models in sociological research: Theory into practice. Annual Review of Sociology 30: 507–544.

  18. Halffman, Willem, and Loet Leydesdorff. 2010. Is inequality among universities increasing? Gini coefficients and the elusive rise of elite universities. Minerva 48(1): 55–72.

  19. Hicks, Diana. 2012. Performance-based university research funding systems. Research Policy 41: 251–261.

  20. Hicks, Diana, and J. Sylvan Katz. 2011. Equity and excellence in research funding. Minerva 49(2): 137–151.

  21. Jaquette, Ozan, and Edna Parra. 2016. The problem with the Delta Cost Project Database. Research in Higher Education 57(5): 630–651.

  22. Javitz, Harold. 2006. Statistical analysis of publication trends in U.S. universities. Washington, DC: SRI International.

  23. King, David A. 2004. The scientific impact of nations. Nature 430: 311–316.

  24. Larsen, Peder Olesen, and Markus von Ins. 2010. The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index. Scientometrics 84(3): 575–603.

  25. Leydesdorff, Loet, and Caroline S. Wagner. 2009. Is the United States losing ground in science? A global perspective on the world science system. Scientometrics 78: 23–36.

  26. McNamee, Stephen J., and Cecil L. Willis. 1994. Stratification in science: A comparison of publication patterns in four disciplines. Science Communication 15: 396–416.

  27. Merton, Robert K. 1968. The Matthew Effect in science: The reward and communication systems of science reconsidered. Science 159(3810): 56–63.

  28. National Science Board. 2014. Science and Engineering Indicators 2014. Washington, DC: National Science Foundation (NSB 14-01).

  29. National Science Foundation (NSF). 2015. National Center for Science and Engineering Statistics Higher Education R&D Survey. Washington, DC: NSF.

  30. Necker, Sarah. 2014. Scientific misbehavior in economics. Research Policy 43: 1747–1759.

  31. Owen-Smith, Jason. 2003. From separate systems to hybrid order: Accumulative advantage across public and private science at Research One Universities. Research Policy 32: 1081–1104.

  32. Office of Technology Assessment. 1991. Federally funded research: Decision for a decade. Washington DC: U.S. Government Printing Office.

  33. Pouris, Anastassios. 2007. The international performance of the South African academic institutions: A citation assessment. Higher Education 54: 501–509.

  34. Rogers, W.H. 1993. Regression standard errors in clustered samples. Stata Technical Bulletin 13: 19–23. (Reprinted in Stata Technical Bulletin Reprints, vol. 3, 88–94.).

  35. Rosenzweig, Robert. 1992. Balancing national research capacity with its support. In Science and Technology Policy Yearbook 1992, eds. Stephen D. Nelson, Kathleen M. Gramp, and Albert H. Teich, 205–208. Washington DC: American Association for the Advancement of Science.

  36. Times Higher Education. 2011. Citation averages, 2000–2010, by fields and years. https://www.timeshighereducation.com/news/citation-averages-2000-2010-by-fields-and-years/415643.article/. Accessed 04 July 2017.

  37. Toutkoushian, Robert K., Stephen R. Porter, Cherry Danielson, and Paula R. Hollis. 2003. Using publication counts to measure an institution’s research productivity. Research in Higher Education 44: 121–148.

  38. U.S. House of Representatives, House Committee on Science, Space, and Technology. 1992. Report on the task force on the health of research. Washington DC: U.S. Government Printing Office.

  39. U.S. News & World Report (USNWR). 2015 (Sept. 9). America’s best colleges. www.usnwr.com/best-colleges/.

  40. Uzzi, Brian, Satyam Mukherjee, Michael Stringer, and Ben Jones. 2013. Atypical combinations and scientific impact. Science 342: 468–472.

  41. Ville, Simon, Abbas Valadkhani, and Martin O’Brien. 2006. Distribution of research performance across Australian universities, 1992–2003, and its implications for building diversity. Australian Economic Papers 45: 343–361.

  42. Winston, Gordon C. 1999. Subsidies, hierarchy and peers: The awkward economics of higher education. Journal of Economic Perspectives 13: 13–36.

  43. Winston, Gordon C. 2004. Differentiation among U.S. colleges and universities. Review of Industrial Organization 24: 331–354.

  44. Ziman, John. 1994. Prometheus bound: Science in a dynamic steady state. Cambridge: Cambridge University Press.

Acknowledgments

We would like to thank Michaela Curran and Matthew C. Mahutga for consulting on the statistical modeling used in this paper. We would like to thank the anonymous reviewers for comments that improved the quality of the paper.

Author information

Corresponding author

Correspondence to Steven Brint.

Appendix

Table 3 Descriptive statistics

Cite this article

Brint, S., Carr, C.E. The Scientific Research Output of U.S. Research Universities, 1980–2010: Continuing Dispersion, Increasing Concentration, or Stable Inequality?. Minerva 55, 435–457 (2017). https://doi.org/10.1007/s11024-017-9330-4

Keywords

  • Higher education
  • Research productivity
  • Institutional stratification
  • Institutional mobility