Will the Tide Lift All Boats? Examining the Equity Effects of Performance Funding Policies in U.S. Higher Education

Abstract

This study considers whether performance funding policies systematically harm some types of institutions of higher education while helping others. Building on theories of deck stacking and institutional stratification, we develop and discuss a formal theoretical model of the effects of performance funding policies on individual institutions. The model points to two likely types of policy effects: one that improves overall institutional performance and another that exacerbates unevenness in quality among institutions. We then conduct an initial empirical test of our theory, analyzing a cross-sectional time-series dataset of U.S. colleges and universities. Our findings are somewhat mixed. The adoption of performance funding policies appears to boost overall average levels of degree production in some instances. However, performance funding 2.0 policies are also associated with larger variance in degree production rates. We also find some evidence that 2.0 policies have heterogeneous effects on graduation and retention rates, whereby the benefits of these policies disproportionately accrue to institutions already positioned to perform well.

Notes

  1.

    Equation (2) implies that all institutions receive an equal amount of resources (funding per student) in the absence of a performance funding policy. Our model can still accommodate variation in funding levels (due to reasons other than performance funding) if we consider such variation to be a part of the \( x_{i,t} \) term in Eq. (1) rather than a part of the resources term (\( r_{i,t} \)).
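
    To illustrate the bookkeeping involved (a purely hypothetical additive form; the article's actual Eq. (1) is not reproduced here, and the symbols \( \beta \), \( \gamma \), \( b_i \), and \( \varepsilon_{i,t} \) are introduced only for this sketch), suppose performance were generated as \( p_{i,t} = \beta x_{i,t} + \gamma r_{i,t} + \varepsilon_{i,t} \). If institution \( i \) received a fixed funding differential \( b_i \) unrelated to performance funding, the contribution \( \gamma b_i \) could simply be relabeled as part of the institutional-characteristics term \( \beta x_{i,t} \), leaving \( r_{i,t} \) to capture only the funding variation generated by the performance funding policy itself.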

  2.

    Some smaller states have very few institutions of higher education, which may lessen concerns that performance funding policies will exacerbate inequality among institutions. Instead, the main sources of inequality will probably be the K-12 education system or uneven distribution of resources within a university or college.

  3.

    Eq. (3) assumes that the size of a performance funding policy's effect on institutional behavior depends linearly on the strength of the performance-based component of the allocation formula (i.e., the degree to which funding levels vary with past performance, as expressed in \( \alpha \)); as a result, increasing the intensity of the performance funding policy (\( \alpha \)) increases both \( \theta \) and \( \tau \). We do not emphasize this aspect of the model, however, and could easily have obtained Eq. (5) under the assumption that \( \tau \) is fixed for all performance funding policies regardless of the magnitude of \( \alpha \); under such an alternative derivation, \( \tau \) would be interpreted as a fixed response to the presence of any performance funding policy. Under the formulation presented in the manuscript, \( \tau \) represents the response to a one-unit increase in the intensity of a performance funding policy multiplied by the actual intensity of the policy in effect.
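
    To state the distinction compactly (a notational sketch only; \( \tau_0 \) and the indicator \( \mathbf{1}[\cdot] \) are introduced here for illustration and do not appear in the manuscript): under the formulation in the text, the behavioral response scales with policy intensity, \( \tau = \tau_0 \alpha \), where \( \tau_0 \) is the response to a one-unit increase in \( \alpha \); under the alternative derivation, the response would instead be a constant triggered by the mere presence of a policy, \( \tau = \tau_0 \mathbf{1}[\alpha > 0] \).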

  4.

    In year 1, the term \( \theta p_{i,t - 1}^* \) from Eq. (5) is ignored (set to zero) since there is no prior performance for the institutions.

  5.

    Table 1 shows a 0.01 move in the median, but this is likely due to sampling error since the number of simulations is only 500.
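
    As a general point about simulation noise (a generic bound, not a figure from the manuscript): Monte Carlo error shrinks only at rate \( 1/\sqrt{R} \), so with \( R = 500 \) replications any summary statistic still carries simulation error on the order of \( \sigma/\sqrt{500} \approx 0.045\,\sigma \), where \( \sigma \) is the replication-to-replication standard deviation of that statistic. Whether a 0.01 shift in the median exceeds this noise therefore depends on the spread of the simulated medians.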

  6.

    We can also consider how fixed institutional characteristics that affect performance might influence the results of our simulations. See the online appendix for additional Monte Carlo simulations that incorporate both fixed and time-varying components of institutional performance.
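
    As a minimal illustration of the simulation logic sketched in notes 3-6, the toy Monte Carlo below uses our own functional form and parameter values (the function names and all numerical settings are assumptions for this sketch, not the authors' code). It includes a fixed institutional component, a time-varying shock, a carryover term that is zeroed out in year 1, and a policy response that scales with \( \alpha \).

```python
import numpy as np

# Toy Monte Carlo sketch (our own parameterization, not the authors' simulation code).
# It mirrors the logic in notes 3-6: a fixed institutional component, a time-varying
# shock, a carryover term theta * (prior performance) that is set to zero in year 1,
# and a policy response tau that scales linearly with policy intensity alpha.

def simulate(alpha, n_inst=100, n_years=10, theta=0.5, tau0=0.2, seed=None):
    """Simulate one panel of institutional performance under policy intensity alpha."""
    rng = np.random.default_rng(seed)
    fixed = rng.normal(0.0, 1.0, n_inst)               # fixed institutional component
    perf = np.zeros((n_years, n_inst))
    for t in range(n_years):
        shock = rng.normal(0.0, 0.5, n_inst)           # time-varying component
        carry = theta * perf[t - 1] if t > 0 else 0.0  # year 1: no prior performance
        # Response grows with alpha and, in this toy version, with the fixed advantage,
        # which is what generates widening dispersion under the policy.
        policy = tau0 * alpha * (1.0 + fixed)
        perf[t] = fixed + shock + carry + policy
    return perf[-1]                                    # final-year performance

def monte_carlo(alpha, reps=500, seed=0):
    """Replicate the simulation and summarize the final-year distribution."""
    finals = np.vstack([simulate(alpha, seed=seed + r) for r in range(reps)])
    return finals.mean(), finals.std(axis=1).mean()

for a in (0.0, 1.0):
    mean_perf, spread = monte_carlo(a)
    print(f"alpha={a}: mean final performance={mean_perf:.2f}, avg within-run SD={spread:.2f}")
```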

  7.

    Of the 10,704 observations in our sample, 182 belong to schools that report financial data according to Financial Accounting Standards Board conventions. As a robustness check, we dropped these 182 observations and still found results substantively similar to those reported here.

  8.

    By unbalanced, we mean that some years may include data on all 520 institutions, but many do not. Across all years, 520 institutions appear in the dataset at some point.

  9.

    Specifically, the degree production variable \( Y \) was regressed on \( Y_{t-1} \) and then on \( Y_{t+1} \) in two separate regressions. Observations were dropped whenever the residuals from both regressions were greater than 10 or both were less than −10. For the retention rate, which has a larger standard deviation, the same process was followed with thresholds of 15 and −15. Observations were also dropped if the residual from the above regression was greater than 40 or if the reported retention rate was 0.
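
    A minimal pandas sketch of this screening rule follows, assuming hypothetical column names (inst_id, year, degree_prod) and pooled lag/lead regressions; the thresholds match the note, but the data layout is our assumption rather than a description of the authors' code.

```python
import numpy as np
import pandas as pd

def residuals_from(df, y, x):
    """OLS of y on a single predictor x (fit on non-missing rows); residuals for all rows."""
    sub = df[[y, x]].dropna()
    slope, intercept = np.polyfit(sub[x], sub[y], 1)
    return df[y] - (intercept + slope * df[x])

def flag_outliers(df, y="degree_prod", threshold=10):
    """Flag rows whose residuals from BOTH the lag and the lead regressions fall
    outside +/- threshold, per the screening rule described in this note.
    Column names (inst_id, year, degree_prod) are illustrative."""
    df = df.sort_values(["inst_id", "year"]).copy()
    df["y_lag"] = df.groupby("inst_id")[y].shift(1)    # Y_{t-1} within institution
    df["y_lead"] = df.groupby("inst_id")[y].shift(-1)  # Y_{t+1} within institution
    r_lag = residuals_from(df, y, "y_lag")
    r_lead = residuals_from(df, y, "y_lead")
    both_high = (r_lag > threshold) & (r_lead > threshold)
    both_low = (r_lag < -threshold) & (r_lead < -threshold)
    return both_high | both_low

# Degree production: drop rows flagged at threshold 10.
# Retention rate: reuse flag_outliers with threshold=15, and additionally drop rows
# with a residual above 40 or a reported retention rate of 0.
# df_clean = df.loc[~flag_outliers(df, "degree_prod", 10)]
```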

  10.

    For many of our models, we tried a variety of specifications that included lags of various lengths, but we found no consistent evidence of lagged effects.

  11.

    Sources disagree substantially regarding the appropriate coding of the performance funding policy variable for three states: Colorado, North Carolina, and South Dakota. Dropping these three states does not substantially alter the findings presented in Table 2.

  12.

    While there is a theoretical ceiling in performance (a 100% retention rate), even high-performing institutions in our sample rarely come anywhere close to hitting this ceiling (the 99th percentile is 96% retention).

  13.

    As a robustness check, we restricted the sample to institutions with a Barron's selectivity rating of 3 or lower, since almost all HBCUs fall within that range. The results with this restricted sample (not shown here but available upon request) are very similar to those from the full sample.


Author information

Corresponding author

Correspondence to Amanda Rutherford.

Electronic supplementary material

Supplementary material 1 (DOCX 33 kb)

About this article

Cite this article

Favero, N., Rutherford, A. Will the Tide Lift All Boats? Examining the Equity Effects of Performance Funding Policies in U.S. Higher Education. Res High Educ 61, 1–25 (2020). https://doi.org/10.1007/s11162-019-09551-1

Keywords

  • Performance funding
  • Equity
  • Performance
  • Formal model