
Learning Citizenship? How State Education Reforms Affect Parents’ Political Attitudes and Behavior


Over the past three decades, the states have adopted a suite of reforms to their education systems in an effort to improve school performance. While scholars have speculated about the political consequences of these policies, to date there has been no empirical research investigating how these reforms affect the practice of American democracy. Combining data from an original survey of public school parents with information on state education standards, testing, and accountability policies, I examine how design features of these policies influence parents’ attitudes about government, participation in politics, and involvement in their children’s education. My research shows that parents residing in states with more developed assessment systems express more negative attitudes about government and education, and are less likely to become engaged in some forms of involvement in their children’s education, than are parents who live in states with less developed assessment systems.



  1.

    Other kinds of education accountability policies exist. For example, some states have school choice policies that allow students to enroll in a variety of different types of schools, while others have student accountability policies that require students to pass exit examinations as a condition of receiving a diploma. I focus on content standards, assessments, and school accountability policies in this paper because, unlike the other reforms described here, these three types of policies have been adopted in every state.

  2.

    Of course, it is possible that these policies also affect the attitudes and behaviors of other groups, such as students, teachers, and the public at large. Future research should examine the scope and direction of the effects of school accountability policies on these other groups.

  3.

    For the purposes of this article, I set aside the extremely fraught question of whether the actual impact of these policies on student achievement also affects parental attitudes and behaviors.

  4.

    5.8% were “unsure.”

  5.

    Critically, because these policies are instituted at the state level (rather than locality by locality), the costs to parents and children of exit are very high. While parents routinely make decisions about where to live within a given community in part based on local school quality, they cannot easily move to another state if they are dissatisfied with state standards, testing, and accountability policies. Consequently, parents may feel “stuck” with state-instituted reforms they oppose, exacerbating feelings of disempowerment.

  6.

    Polimetrix is regularly used by social science researchers to conduct public opinion surveys. For example, the firm conducts the surveys for the Cooperative Congressional Election Study project (Cooperative Congressional Elections Study 2012).

  7.

    When verifying information is not provided, EPE staff attempt to locate adequate information from publicly accessible records.

  8.

    For more information on the Common Core of Data, see

  9.

    The internal consistency of the seven variables used to construct the Political Participation Index was high (Cronbach’s alpha = 0.80), suggesting that the items are strongly inter-correlated and can be combined into a single scale measuring respondents’ general propensity to participate in politics.
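The reliability statistic reported in this note can be computed directly from the item responses. The sketch below, written in Python rather than the article's statistical software, calculates Cronbach's alpha for a matrix of simulated yes/no items; the data and names are illustrative, not the survey's.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    scale_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / scale_variance)

# Simulated 0/1 answers to seven participation items, all loading on one latent trait
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
items = (latent + rng.normal(size=(500, 7)) > 0).astype(float)
print(round(cronbach_alpha(items), 2))
```

Because the simulated items share a common latent component, the computed alpha is well above zero, illustrating why strongly inter-correlated items justify a single summed scale.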

  10.

    Estimating separate models for each of the variables comprising the index also sheds light on whether a particular subset of items is influencing estimates of results for the index as a whole.

  11.

    I thank an anonymous reviewer for the encouragement to use these measures in my analysis.

  12.

    Importantly, granular knowledge about how standards-based reforms affect particular modes of parental involvement is consistent with broader trends in research on how parents participate in their children’s education. Whereas earlier research tended to lump various forms of parental involvement together into broad indices, more recent research has focused greater attention on the causes and consequences of specific modes of involvement (e.g. Hill and Tyson 2009).

  13.

    The 10th grade exam was not offered in 2011.

  14.

    This was the most recent national election prior to the period (January 2012) during which the survey was administered.

  15.

    Recent research (Arceneaux and Nickerson 2009; Angrist and Pischke 2009) demonstrates that robust clustered standard errors are as effective as more complex methods (random effects, HLM) in producing precise estimates of group-level effects when the number of clusters exceeds 20, as it does in my research.

  16.

    The means and standard deviations of the variables constituting the Political Participation Index were: Contact Official (mean = 0.37, SD = 0.48); Attend Rally (mean = 0.13, SD = 0.35); Contribute to Campaign (mean = 0.17, SD = 0.37); Worked for Campaign (mean = 0.09, SD = 0.29); Wore Button (mean = 0.25, SD = 0.43); Joined Internet Group (mean = 0.11, SD = 0.32); Joined Political Organization (mean = 0.16, SD = 0.36).

  17.

    The means and standard deviations of the variables constituting the Parental Involvement Index were: Contact Teacher (mean = 0.74, SD = 0.44); Attend Open House (mean = 0.76, SD = 0.43); Attend Parent-Teacher Association Meeting (mean = 0.44, SD = 0.50); Attend Parent-Teacher Conference (mean = 0.71, SD = 0.45); Attend Parental Advisory Meeting (mean = 0.30, SD = 0.45); Attend School Event (mean = 0.71, SD = 0.45); Volunteer at School (mean = 0.33, SD = 0.47); Help at Fundraiser (mean = 0.46, SD = 0.50); and Attend School Board Meeting (mean = 0.27, SD = 0.45).

  18.

    In the raw data, State Standards appears to have a modest negative effect on School Quality, but not the other dependent variables. However, when I matched the data on State Standards and re-estimated the models, the effect disappeared. Thus, I consider this result highly tentative. This analysis is available from the author on request.

  19.

    Detailed information about post-matching balance is available from the author on request.

  20.

    To assess whether and to what extent the matched samples departed from the full sample, I compared the variable means for the treatment and control groups across the matched and full samples for each of the five matched datasets. I did not observe large differences across the two samples, though (as expected) overall balance across the treatment and control groups was better in the matched samples. However, as Imbens and Wooldridge (2008) note, matched samples may differ from full samples on unobserved characteristics, and such differences (by definition) cannot be evaluated. This is a primary reason why care must be taken in discussing the generalizability of results obtained from matched samples.

  21.

    Notably, Rosenbaum sensitivity analysis makes the extremely strong assumptions that the unobserved covariate(s) “exhibit a strong, near perfect relationship with the response” (Rosenbaum 2002, p. 111) and are independent from the covariates used in matching, which are unlikely to hold in most practical applications (see also Mayer 2011, p. 642).

  22.

    The results suggested that the state standards, assessments, and accountability policies had no effect on any of these variables, with the exceptions that School Accountability was positively associated with the probability that parents would report wearing a button (p < .05) and joining an internet-based political group (p < .05). To further investigate the relationship between School Accountability and the political participation variables, I employed genetic matching. I recoded School Accountability as a dichotomous variable, with values 0 through 4 recoded as 0 and 5 recoded as 1 (I selected this cut-point because the average value of School Accountability was 4.04 and because this coarsening divided the sample almost exactly in half). I then genetically matched this coarsened measure on the other independent variables in each of the five datasets and re-estimated the logistic regression models on the political participation variables in the genetically matched datasets. The results of these analyses provided no support for the hypothesis that School Accountability was positively associated with the probability of participating in these political activities. I concluded that the relationship between School Accountability and these participation variables was tenuous.
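The coarsening step described in this note is simple to express in code. The sketch below uses hypothetical scores rather than the survey data: values 0 through 4 become 0, and the top score of 5 becomes 1, which is the recode that splits a sample near the variable's mean of 4.04.

```python
import numpy as np

# Hypothetical School Accountability scores on the 0-5 scale (not the survey data)
school_accountability = np.array([3, 5, 4, 5, 2, 5, 4, 0, 5, 4])

# Coarsen: scores 0-4 are recoded to 0, the top score of 5 to 1
accountability_high = (school_accountability == 5).astype(int)
print(accountability_high.tolist())  # → [0, 1, 0, 1, 0, 1, 0, 0, 1, 0]
```

The dichotomized variable can then serve as the "treatment" indicator in a matching procedure such as genetic matching.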

  23.

    While the multivariate model for Attend Open House also appears to indicate a statistically significant negative effect for State Assessments, I was unable to replicate this result using the rbounds command. Consequently, I consider this result more tentative. The discrepancy may arise because the genetic matching procedure for my multivariate analysis followed best practice by matching with replacement, while proper implementation of sensitivity analysis with the rbounds command requires matching without replacement. Notably, the results for the other variables with statistically significant effects held regardless of whether matching was done with or without replacement, and regardless of whether the analysis was conducted with multivariate methods or with rbounds.

  24.

    Full results are available from the author on request.


  1. Angrist, J., & Pischke, J.-S. (2009). Mostly harmless econometrics: An empiricist’s companion. Princeton: Princeton University Press.


  2. Apple, M. (2006). Educating the right way: Markets, standards, god, and inequality (2nd ed.). New York: Routledge.


  3. Arceneaux, K., & Nickerson, D. W. (2009). Modeling certainty with clustered data: A comparison of methods. Political Analysis, 17(2), 177–190.


  4. Banerjee, A. V., & Duflo, E. (2008). The experimental approach to development economics (No. w14467). National Bureau of Economic Research.

  5. Barksdale-Ladd, M. A., & Thomas, K. F. (2000). What’s at stake in high-stakes testing: Teachers and parents speak out. Journal of Teacher Education, 51(5), 384–397.


  6. Brookhart, S. M. (2013). The public understanding of assessment in education in the United States. Oxford Review of Education, 39(1), 52–71.


  7. Bruch, S., Ferree, M., & Soss, J. (2010). From policy to polity: Democracy, paternalism, and the incorporation of disadvantaged citizens. American Sociological Review, 75(2), 205–226.


  8. Bushaw, W. J., & Lopez, S. J. (2012). Public education in the United States: A nation divided. In: The 44th annual Phi Delta Kappa/Gallup poll of the public’s attitudes toward the public schools. Washington, DC: Phi Delta Kappa.

  9. Campbell, A. (2002). Self-interest, Social Security, and the distinctive participation patterns of senior citizens. American Political Science Review, 96(3), 565–574.


  10. Campbell, A. (2003a). How policies make citizens: Senior political activism and the American welfare state. Princeton: Princeton University Press.


  11. Campbell, A. (2003b). Participatory reactions to policy threats: Senior citizens and the defense of Social Security and Medicare. Political Behavior, 25(1), 29–49.


  12. Citizens’ Commission on Civil Rights. (2001). Closing the deal: A preliminary report on state compliance with final assessment and accountability requirements under the Improving America’s Schools Act of 1994. Washington, DC: Citizens’ Commission on Civil Rights.


  13. Cohen, D. K. (1995). What is the system in systemic reform? Educational Researcher, 24(9), 11–31.


  14. Cooperative Congressional Elections Study. (2012). Available from the Cooperative Congressional Elections Study website. Retrieved July 1, 2012 from

  15. DeBray-Pelot, E., & McGuinn, P. (2009). The new politics of education: Analyzing the federal education policy landscape in the post-NCLB era. Educational Policy, 23(1), 15–42.


  16. Dee, T., & Jacob, B. (2011). The impact of No Child Left Behind on student achievement. Journal of Policy Analysis and Management, 30(3), 418–446.


  17. Diamond, A., & Sekhon, J. (2013). Genetic matching for estimating causal effects: A general multivariate matching method for achieving balance in observational studies. Review of Economics and Statistics, 95(3), 932–945.


  18. Editorial Projects in Education. (2012). Available from Editorial Projects in Education, Custom Data Services. Retrieved June 10, 2012 from

  19. Education Week. (2012a). Quality counts 2012: The global challenge. Washington, DC: Education Week. Retrieved July 11, 2013 from

  20. Education Week. (2012b). Methodology: About the state policy survey. Retrieved July 11, 2013 from

  21. Fan, X., & Chen, M. (2001). Parental involvement and students’ academic achievement: A meta-analysis. Educational Psychology Review, 13(1), 1–22.


  22. Figlio, D. N. (2005). Testing, crime and punishment. Journal of Public Economics, 90(4), 837–851.


  23. Figlio, D., & Loeb, S. (2011). School accountability. Handbook of the Economics of Education, 3, 383–421.


  24. Figlio, D., & Rouse, C. (2006). Do accountability and voucher threats improve low-performing schools? Journal of Public Economics, 90(1), 239–255.


  25. Figlio, D. N., & Winicki, J. (2005). Food for thought: The effects of school accountability plans on school nutrition. Journal of Public Economics, 89(2), 381–394.


  26. Fuhrman, S. H., & Massell, D. (1992). Issues and strategies in systemic reform. New Brunswick, NJ: Consortium for Policy Research in Education, Eagleton Institute of Politics, Rutgers University.

  27. Hanushek, E., & Raymond, M. (2005). Does school accountability lead to improved student performance? Journal of Policy Analysis and Management, 24(2), 297–327.


  28. Hetherington, M. (2005). Why trust matters. Princeton: Princeton University Press.


  29. Hill, N. E., & Tyson, D. F. (2009). Parental involvement in middle school: A meta-analytic assessment of the strategies that promote achievement. Developmental Psychology, 45(3), 740–763.


  30. Ho, D., Imai, K., King, G., & Stuart, E. (2007). Matching as nonparametric preprocessing for reducing model dependence in parametric causal inference. Political Analysis, 15(3), 199–236.


  31. Ho, D., Imai, K., King, G., & Stuart, E. (2011). MatchIt: Nonparametric preprocessing for parametric causal analysis, June 28. Retrieved July 11, 2013 from

  32. Honaker, J., King, G., & Blackwell, M. (2010). Amelia II: A program for missing data. Version 1.2-18, November 4. Retrieved July 12, 2013 from

  33. Hong, S., & Ho, H. Z. (2005). Direct and indirect longitudinal effects of parental involvement on student achievement: Second-order latent growth modeling across ethnic groups. Journal of Educational Psychology, 97(1), 32–42.


  34. Hursh, D. (2007). Assessing No Child Left Behind and the rise of neoliberal education policies. American Educational Research Journal, 44(3), 493–518.


  35. Imbens, G. M., & Wooldridge, J. M. (2008). Recent developments in the econometrics of program evaluation (No. w14251). National Bureau of Economic Research.

  36. Jacob, B. A. (2005). Accountability, incentives and behavior: The impact of high-stakes testing in the Chicago public schools. Journal of Public Economics, 89(5), 761–796.


  37. Jacob, B. A., & Levitt, S. D. (2003). Rotten apples: An investigation of the prevalence and predictors of teacher cheating. The Quarterly Journal of Economics, 118(3), 843–877.


  38. Jacobson, R., Saultz, A., & Snyder, J. W. (2013). When accountability strategies collide: Do policy changes that raise accountability standards also erode public satisfaction? Educational Policy, 27(2), 360–389.


  39. Jeynes, W. H. (2005). A meta-analysis of the relation of parental involvement to urban elementary school student academic achievement. Urban Education, 40(3), 237–269.


  40. Jolliffe, D., & Hedderman, C. (2012). Investigating the impact of custody on reoffending using propensity score matching. Crime and Delinquency. doi:10.1177/0011128712466007.


  41. Keele, L. (2009). rbounds: An R package for sensitivity analysis with matched data. Retrieved July 11, 2013 from

  42. Keele, L. (2010). An overview of rbounds: An R package for Rosenbaum bounds sensitivity analysis with matched data. Retrieved July 11, 2013 from

  43. King, G., Honaker, J., Joseph, A., & Scheve, K. (2001). Analyzing incomplete political science data: An alternative algorithm for multiple imputation. American Political Science Review, 95(1), 49–69.


  44. King, G., & Zeng, L. (2006). The dangers of extreme counterfactuals. Political Analysis, 14(2), 131–159.


  45. Krieg, J. M. (2008). Are students left behind? The distributional effects of the No Child Left Behind Act. Education Finance and Policy, 3(2), 250–281.


  46. Manna, P. (2006). School’s in: Federalism and the national education agenda. Washington, DC: Georgetown University Press.


  47. Manna, P. (2010). Collision course: Federal education policy meets state and local realities. Washington, DC: CQ.


  48. Mayer, A. K. (2011). Does education increase political participation? Journal of Politics, 73(3), 633–645.


  49. McDonald, M. (2013). United States Elections Project Database. Retrieved July 11, 2013 from

  50. McGuinn, P. (2006). No Child Left Behind and the transformation of federal education policy, 1965–2005. Lawrence: University Press of Kansas.


  51. Mehta, J. (2013). How paradigms create politics: The transformation of American Educational Policy, 1980–2001. American Educational Research Journal, 50(2), 285–324.


  52. Mettler, S. (2002). Bringing the state back into civic engagement: Policy feedback effects of the G.I. Bill for World War II veterans. American Political Science Review, 96(2), 351–365.


  53. Mettler, S. (2005). Soldiers to citizens: The G.I. Bill and the making of the greatest generation. Oxford: Oxford University Press.


  54. Mettler, S., & Soss, J. (2004). The consequences of public policy for democratic citizenship: Bridging policy studies and mass politics. Perspectives on Politics, 2(1), 55–73.


  55. Mettler, S., & Stonecash, J. (2008). Government program usage and political voice. Social Science Quarterly, 89(2), 273–293.


  56. Mettler, S., & Welch, E. (2004). Civic generation: Policy feedback effects of the GI Bill on political involvement over the life course. British Journal of Political Science, 34(3), 497–518.


  57. Mulvenon, S. W., Stegman, C. E., & Ritter, G. (2005). Test anxiety: A multifaceted study on the perceptions of teachers, principals, counselors, students, and parents. International Journal of Testing, 5(1), 37–61.


  58. National Assessment of Educational Progress. (2012). Accessed from the National Assessment of Educational Progress website. Retrieved June 10, 2012 from

  59. Nitta, K. (2008). The politics of structural education reform. New York: Routledge.


  60. Piche, D. (1999). Title I in Alabama: The struggle to meet basic needs. Washington, DC: Citizens’ Commission on Civil Rights.


  61. Pierson, P. (1993). When effect becomes cause: Policy feedback and political change. World Politics, 45(4), 595–628.


  62. Polikoff, M. S. (2012). Instructional alignment under No Child Left Behind. American Journal of Education, 118(3), 341–368.


  63. Polikoff, M. S., Porter, A. C., & Smithson, J. (2011). How well aligned are state assessments of student achievement with state content standards? American Educational Research Journal, 48(4), 965–995.


  64. Program on Education Policy and Governance (PEPG). (2012). Survey of the public’s attitudes toward the public schools. Cambridge, MA: Program on Education Policy and Governance, Harvard University.

  65. Public Education Network. (2007). Open to the public: How communities, parents, and students assess the impact of the No Child Left Behind Act—2004–2007: The realities left behind. Washington, DC: Public Education Network.


  66. Remillard, J. T., & Jackson, K. (2006). Old math, new math: Parents’ experiences with standards-based reform. Mathematical Thinking and Learning, 8(3), 231–259.


  67. Rhodes, J. (2012). An education in politics: The origin and evolution of No Child Left Behind. Ithaca: Cornell University Press.


  68. Rogers, M., & Stoneman, C. (1999). Triggering educational accountability. Washington, DC: Center for Law and Education.


  69. Rosenbaum, P. R. (2002). Observational studies. New York: Springer.


  70. Rosenbaum, P. R. (2005). Observational study. In B. S. Everitt & D. C. Howell (Eds.), Encyclopedia of statistics in behavioral science (pp. 1451–1462). Chichester: Wiley.


  71. Schneider, A. L., & Ingram, H. M. (1997). Policy design for democracy. Lawrence, KS: University Press of Kansas.


  72. Segool, N. K., Carlson, J. S., Goforth, A. N., Embse, N., & Barterian, J. A. (2013). Heightened test anxiety among young children: Elementary school students’ anxious responses to high-stakes testing. Psychology in the Schools, 50(5), 489–499.


  73. Sekhon, J. (2011). Multivariate and propensity score matching software with automated balance optimization: The matching package for R. Journal of Statistical Software, 42(7), 1–52.


  74. Sekhon, J., & Grieve, R. (2013). A nonparametric matching method for covariate adjustment with application to economic evaluations. Working paper. Retrieved July 11, 2013 from

  75. Smith, M. S., & O’Day, J. (1991). Systemic school reform. Journal of Education Policy, 5(5), 233–267.


  76. Soss, J. (1999). Lessons of welfare: Policy design, political learning, and political action. American Political Science Review, 93(2), 363–380.


  77. Soss, J. (2000). Unwanted claims: The politics of participation in the U.S. welfare system. Ann Arbor: University of Michigan Press.


  78. Soss, J. (2005). Making clients and citizens: Welfare policy as a source of status, belief, and action. In A. Schneider & H. Ingram (Eds.), Deserving and entitled: Social constructions and public policy (pp. 291–328). New York: SUNY.


  79. Stuart, E. (2010). Matching methods for causal inference: A review and a look forward. Statistical Science, 25(1), 1–21.


  80. Supovitz, J. A., & Taylor, B. S. (2005). Systemic education evaluation: Evaluating the impact of systemwide reform in education. American Journal of Evaluation, 26(2), 204–230.


  81. Tomz, M., Wittenberg, J., & King, G. (2003). Clarify: Software for interpreting and presenting statistical results. Retrieved July 11, 2013 from

  82. Tourangeau, K., Nord, C., Le, T., Sorongon, A. G., & Najarian, M. (2010). Combined user’s manual for the ECLS-K eighth-grade and K-8 full sample data files. Washington, DC: National Center for Educational Statistics, U.S. Department of Education.


  83. U.S. General Accounting Office. (2000). Title I program: Stronger accountability needed for performance of disadvantaged students. Washington, DC: U.S. General Accounting Office.

  84. Weaver, V., & Lerman, A. (2010). Political consequences of the carceral state. American Political Science Review, 104(4), 817–833.


  85. Weick, K. E. (1976). Educational organizations as loosely coupled systems. Administrative Science Quarterly, 21(1), 1–19.




This research was supported with a grant from the Spencer Foundation. An earlier version of this article was prepared for presentation at the 2012 American Political Science Association conference. The author thanks Christopher Howard, Suzanne Mettler, Tatishe Nteta, Steven Teles, and the anonymous reviewers for generous comments on earlier versions; the staff of YouGov/Polimetrix, especially Ashley Grosse, for invaluable assistance in constructing, testing, and fielding the survey; and Jeffrey Mondak and Thomas Rudolph for their editorial guidance and support.

Author information



Corresponding author

Correspondence to Jesse H. Rhodes.



Cite this article

Rhodes, J.H. Learning Citizenship? How State Education Reforms Affect Parents’ Political Attitudes and Behavior. Polit Behav 37, 181–220 (2015).



  • Policy feedback
  • Interpretive effects
  • School accountability
  • Education policies
  • Citizenship
  • Standards
  • Testing
  • Accountability