Direct Democracy, Educative Effects, and the (Mis)Measurement of Ballot Measure Awareness

Abstract

A century ago, Progressive reformers in the U.S. introduced the institutional innovations of direct democracy, claiming these reforms would cultivate better citizens. Two decades of high-profile research have both supported and challenged the relationship between direct democracy, increased attention to politics, and a higher turnout rate. We propose, however, that a necessary condition of the “educative effects” model is voter familiarity with initiatives and referendums. While some research has examined ballot measure awareness, we suspect that the standard measurements—e.g., “Have you heard of Proposition X?”—overestimate actual knowledge. Specifically, we measure ballot measure knowledge in a manner requiring voters to demonstrate familiarity with specific measures rather than merely asserting broad familiarity. Our approach reveals that the public’s awareness of statewide ballot measures, both in the abstract and with respect to particular measures, is far lower than past research suggests. Importantly, it also reveals that people with high levels of education, political interest, and knowledge of national politics are the most likely to misrepresent their ballot measure awareness.


Notes

  1. Political parties and other operatives have applied the educative effects of ballot measures to their own ends in recent election cycles, banking on the hope that specific ballot initiatives would improve the prospects of their preferred candidates. See, for example, Nicholson (2005), Donovan et al. (2009), Smith and Tolbert (2010), and Kousser and McCubbins (2005).

  2. For an illuminating critique of the relationship between social scientists’ understanding of voter competence and the ways in which we measure it, see Lupia (2006).

  3. It is worth highlighting that our study is fundamentally different from the work of Biggers (2011) and Seabrook et al. (2015). Those two studies examine the degree to which the presence of direct democracy delivers on one of the educative effects hypothesis’s promises: a more informed electorate. To assess whether this is the case, they use measurements of general political knowledge. Here, we are concerned with a more foundational aspect of the educative effects hypothesis: Are voters aware of the presence of ballot measures that would, in turn, possibly empower direct democracy’s educative effects? In essence, heightened general political knowledge is one of the potential outcomes of the educative effects hypothesis, whereas awareness of ballot measures constitutes one of its more fundamental assumptions.

  4. For a concise overview of initiative use by state since initial adoption, see Todd Donovan, Daniel A. Smith, Tracy Osborn, and Christopher Z. Mooney. 2013. “Figure 4.4: Historic Statewide Initiative Use,” State and Local Politics: Institutions and Reforms, 4th edition (Stamford, CT: Cengage), p. 119. Whether measured as a total or by average biennial use, Arkansas is in the top third of initiative users.

  5. The other two referred amendments—one making it more difficult to qualify initiatives for the ballot and the other requiring legislative approval before executive branch rules went into effect—generated some editorial comment but far less attention. See “Against Ballot Issue No. 1,” Arkansas Democrat-Gazette, 21 October 2014.

  6. For an overview of the legislative process that led to the amendment, see Noel Oman, “Panel Passes Ethics Amendment,” Arkansas Democrat-Gazette, 10 April 2013. Polling in advance of the election showed that when each element of the measure was explained, poll respondents overwhelmingly opposed the two additions. See “Poll: Term Limits Extension, Official Pay Hurting Ethics Reform Amendment,” Talk Business and Politics, http://talkbusiness.net/2013/10/poll-term-limits-extension-official-pay-hurting-ethics-reform-amendment/#sthash.eDHa4mbL.dpuf.

  7. Arkansas Term Limits, Arkansas Ethics Commission final campaign finance report, 8 December 2014.

  8. Jay Barth, “Minimum Wage Increase Could Help Democrats,” Arkansas Times, 10 April 2014, http://www.arktimes.com/arkansas/minimum-wage-increase-could-help-democrats/Content?oid=3263167. Arkansas Interfaith Alliance, Arkansas Ethics Commission final campaign finance report, 8 October 2014. Arkansas Ethics Commission campaign finance reports.

  9. It is important to note that a series of local alcohol sale legalization measures appeared in “dry” counties around the state, which also expanded the conversation about alcohol sales in certain locales. Arkansas Ethics Commission campaign finance reports.

  10. Max Brantley, “Arkansas Supreme Court Kills Amendments on New Casinos, Tort Reform and Approves Medical Marijuana Amendment,” Arkansas Blog, 13 October 2016.

  11. Brian Fanney, “Court Strikes Medical Marijuana Initiated Act; Issue 7 Votes Won’t Count, but Those on Rival Issue 6 Will,” Arkansas Democrat-Gazette, 28 October 2016.

  12. Arkansas Ethics Commission reports. Accessed at: http://www.arkansasethics.com/blqc.htm.

  13. In Arkansas’s closely-watched 2014 U.S. Senate race, for example, the Arkansas Poll predicted the Republican challenger would garner 57% of the vote; indeed, that was his actual vote share. In the 2016 contest, the Republican incumbent was predicted to garner 61%; he earned 60%. Further details are available at https://fulbright.uark.edu/departments/political-science/partners/arkansas-poll.php. Replication data are available at https://dataverse.harvard.edu/dataverse/polbehavior.

  14. Recent scholarship on the measurement of political knowledge rightly sounds the alarm about the widespread absence of intercoder reliability and/or coding transparency. Providing numerous predetermined categories into which interviewers can place responses is an imperfect, if cost-effective, solution to the problems well articulated by DeBell (2013). It also, however, prevents post hoc reliability checks. Still, we believe we have addressed his most important admonition: “[I]t is important to develop and disclose coding rules that are as precise and comprehensive as possible. This does not eliminate subjectivity from the codes. Instead, it moves subjectivity from behind the curtain of the coder’s mind into the sunshine of peer review, in publicly disclosed precise rules for the coding” (p. 396).

  15. While it is possible that different people are interested in different issues, and that each fails to recall measures they do not plan to vote on, an examination of roll-off rates rules out such a role in our results. Roll-off in 2014 showed little variation: Issues 1 and 2 experienced just 6% roll-off, Issue 3 (ethics reform/term limits) 4%, and the alcohol and minimum wage measures just 2%.

  16. Unfortunately, our sample size on the 2016 survey was not large enough to conduct a statistically meaningful analysis of ballot measure reporting behavior.

  17. Existing work on the factors that either facilitate or impede voters’ capacity to participate meaningfully in direct legislation elections finds roles for education levels (Magleby 1989; Bowler and Donovan 1994), campaign contact (e.g., Bowler and Donovan 1998; Parry et al. 2012), party and elite endorsements (e.g., Boudreau and MacKenzie 2014; Burnett and Parry 2014), ballot language complexity (Reilly and Richey 2011), and multiple initiatives (e.g., Selb 2008). But the dependent variable in these studies is nearly always operationalized either as turnout (presumed to operate as a surrogate for ballot measure interest) or as self-reported—and unverified—awareness of the measure(s) at hand.

  18. We collapse the original measurement of education from a 7-category to a 5-category measure to increase the number of respondents at the tails of the education spectrum. Doing so ensures a large enough sample in each educational category to perform analyses that otherwise would have had too few respondents.

  19. Of course, it is possible these otherwise-well-equipped voters have not misrepresented themselves, but instead have enough experience with political participation to anticipate ballot measures in a generic—if not a specific—sense. While we cannot discern precisely which behavior is driving our finding, we suspect social desirability—i.e., wanting to feel like an attentive citizen—is at least partly responsible (Lodge and Taber 2013).

  20. When a specific ballot measure does activate individuals politically, perhaps it is only in unique circumstances—e.g., a special election when the ballot measure is the sole issue, a time when an individual has actually signed a petition to get the measure on the ballot (Parry et al. 2012), or a case when identity politics is at play (see, for example, Feig 2007).
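As an illustration, the education recode described in note 18 can be sketched in a few lines. This is a hypothetical sketch only: the paper does not specify which of the seven categories are merged, so this version assumes the two lowest and the two highest are combined to thicken the tails.

```python
# Hypothetical sketch of the recode in note 18: collapse the 7-category
# education variable to 5 categories so the tails of the distribution
# contain enough respondents. Which categories merge is an assumption
# (the paper does not say); here the two lowest and two highest combine.
COLLAPSE = {
    1: 1,  # No high school        -> 1 (less than HS diploma)
    2: 1,  # Some high school      -> 1
    3: 2,  # HS graduate           -> 2
    4: 3,  # Some college          -> 3
    5: 4,  # College graduate      -> 4
    6: 5,  # Some graduate school  -> 5 (post-graduate exposure)
    7: 5,  # Graduate/prof. degree -> 5
}

def collapse_education(seven_cat: int) -> int:
    """Map the 7-category education code to the 5-category version."""
    return COLLAPSE[seven_cat]
```

Any grouping that preserves the ordering of the scale would serve the same purpose; the point is simply that adjacent sparse categories are pooled before analysis.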

References

  • Benz, M., & Stutzer, A. (2004). Are voters better informed when they have a larger say in politics? Public Choice, 119, 31–59.

  • Berelson, B. R., Lazarsfeld, P. F., & McPhee, W. N. (1954). Voting: A study of opinion formation in a presidential campaign. Chicago: University of Chicago Press.

  • Biggers, D. R. (2011). When ballot issues matter: Social issue ballot measures and their impact on turnout. Political Behavior, 33, 3–25.

  • Biggers, D. R. (2012). Can a social issue proposition increase political knowledge? Campaign learning and the educative effects of direct democracy. American Politics Research, 40(6), 998–1025.

  • Biggers, D. R. (2014). Morality at the ballot: Direct democracy and political engagement in the United States. New York: Cambridge University Press.

  • Boehmke, F. J., & Alvarez, R. M. (2014). The influence of initiative signature-gathering campaigns on political participation. Social Science Quarterly, 95, 165–183.

  • Boudreau, C., & MacKenzie, S. A. (2014). Informing the electorate? How party cues and policy information affect public opinion about initiatives. American Journal of Political Science, 58, 48–62.

  • Bowler, S., & Donovan, T. (1994). Information and opinion change on ballot propositions. Political Behavior, 16(4), 411–435.

  • Bowler, S., & Donovan, T. (1998). Demanding choices: Opinion, voting, and direct democracy. Ann Arbor: University of Michigan Press.

  • Bowler, S., & Donovan, T. (2002). Democracy, institutions, and attitudes about citizen influence on government. British Journal of Political Science, 32, 371–390.

  • Burnett, C. M., & Kogan, V. (2012). Familiar choices: Reconsidering institutional effects of the direct initiative. State Politics and Policy Quarterly, 12, 204–224.

  • Burnett, C. M., & Kogan, V. (2015). When does ballot language influence voter choices? Evidence from a survey experiment. Political Communication, 31, 109–126.

  • Burnett, C. M., & Parry, J. A. (2014). Gubernatorial endorsements and ballot measure approval. State Politics and Policy Quarterly, 14, 178–195.

  • Childers, M., & Binder, M. (2012). Engaged by the initiative? How the use of citizen initiatives increases voter turnout. Political Research Quarterly, 65, 93–103.

  • DeBell, M. (2013). Harder than it looks: Coding political knowledge on the ANES. Political Analysis, 21, 393–406.

  • Delli Carpini, M. X., & Keeter, S. (1993). Measuring political knowledge: Putting first things first. American Journal of Political Science, 37, 1179–1206.

  • Delli Carpini, M. X., & Keeter, S. (1996). What Americans know about politics and why it matters. New Haven, CT: Yale University Press.

  • Donovan, T., Tolbert, C. J., & Smith, D. A. (2009). Political engagement, mobilization and direct democracy. Public Opinion Quarterly, 73, 98–118.

  • Dyck, J., & Lascher, E. (2009). Direct democracy and political efficacy reconsidered. Political Behavior, 31, 401–427.

  • Dyck, J. J., & Seabrook, N. R. (2010). Mobilized by direct democracy: Short-term versus long-term effects and the geography of turnout in ballot measure elections. Social Science Quarterly, 91, 188–208.

  • Everson, D. (1981). The effects of initiatives on voter turnout: A comparative state analysis. Western Political Quarterly, 34, 415–425.

  • Feig, D. G. (2007). Race, roll-off, and the straight-ticket option. Politics & Policy, 35, 548–568.

  • Geer, J. G. (1988). What do open-ended questions measure? Public Opinion Quarterly, 52, 565–571.

  • Goebel, T. (2002). A government by the people: Direct democracy in America, 1890–1940. Chapel Hill, NC: University of North Carolina Press.

  • Kousser, T., & McCubbins, M. (2005). Social choice, crypto-initiatives, and policymaking by direct democracy. Southern California Law Review, 78, 949–984.

  • Lewis-Beck, M. S., Jacoby, W. G., Norpoth, H., & Weisberg, H. F. (2008). The American voter revisited. Ann Arbor: University of Michigan Press.

  • Lodge, M., & Taber, C. S. (2013). The rationalizing voter. New York: Cambridge University Press.

  • Lupia, A. (2006). How elitism undermines the study of voter competence. Critical Review, 18, 217–232.

  • Lupia, A., & Matsusaka, J. (2004). Direct democracy: New approaches to old questions. Annual Review of Political Science, 7, 463–482.

  • Lyons, J., Jaeger, W. P., & Wolak, J. (2012). The roots of citizens’ knowledge of state politics. State Politics & Policy Quarterly, 13, 183–202.

  • Magleby, D. (1984). Direct legislation: Voting on ballot propositions in the United States. Baltimore: Johns Hopkins University Press.

  • Mendelsohn, M., & Cutler, F. (2000). The effect of referendums on democratic citizens: Information, politicization, efficacy and tolerance. British Journal of Political Science, 30, 685–698.

  • Mondak, J. J. (2001). Developing valid knowledge scales. American Journal of Political Science, 45, 224–238.

  • Mondak, J. J., & Davis, B. C. (2001). Asked and answered: Knowledge levels when we will not take “don’t know” for an answer. Political Behavior, 23, 199–224.

  • Nicholson, S. (2003). The political environment and ballot proposition awareness. American Journal of Political Science, 47, 403–410.

  • Nicholson, S. (2005). Voting the agenda: Candidates, elections, and ballot propositions. Princeton, NJ: Princeton University Press.

  • Oppenheimer, D., & Edwards, M. (2012). Democracy despite itself: Why a system that shouldn’t work at all works so well. Cambridge, MA: MIT Press.

  • Parry, J. A., Smith, D. A., & Henry, S. (2012). The impact of petition signing on voter turnout. Political Behavior, 34, 117–136.

  • Reilly, S., & Richey, S. (2011). Ballot question readability and roll-off: The impact of language complexity. Political Research Quarterly, 64, 59–67.

  • Robison, J. (2015). Who knows? Question format and political knowledge. International Journal of Public Opinion Research, 27, 1–21.

  • Schlozman, D., & Yohai, I. (2008). How initiatives don’t always make citizens: Ballot initiatives in the American states, 1978–2004. Political Behavior, 30, 469–489.

  • Seabrook, N. R., Dyck, J. J., & Lascher, E. L. (2015). Do ballot initiatives increase general political knowledge? Political Behavior, 37, 279–307.

  • Selb, P. (2008). Supersized votes: Ballot length, uncertainty, and choice in direct legislation elections. Public Choice, 135, 319–336.

  • Smith, M. A. (2001). The contingent effects of ballot initiatives and candidate races on turnout. American Journal of Political Science, 45, 700–706.

  • Smith, M. A. (2002). Ballot initiatives and the democratic citizen. The Journal of Politics, 64, 892–903.

  • Smith, D. A., & Tolbert, C. (2004). Educated by initiative: The effects of direct democracy on citizens and political organizations. Ann Arbor: University of Michigan Press.

  • Smith, D. A., & Tolbert, C. (2007). The instrumental and educative effects of ballot measures: Research on direct democracy in the American states. State Politics and Policy Quarterly, 7, 416–445.

  • Smith, D. A., & Tolbert, C. (2010). Direct democracy, public opinion, and candidate choice. Public Opinion Quarterly, 74, 85–108.

  • Tolbert, C. J., Grummel, J., & Smith, D. A. (2001). The effect of ballot initiatives on voter turnout in the American states. American Politics Research, 29, 625–648.

  • Tolbert, C., McNeal, R. S., & Smith, D. A. (2003). Enhancing civic engagement: The effect of direct democracy on political participation and knowledge. State Politics and Policy Quarterly, 3, 23–41.

  • Tolbert, C. J., & Smith, D. A. (2005). The educative effects of ballot initiatives on voter turnout. American Politics Research, 33, 283–309.

Acknowledgements

The authors wish to thank the ever-helpful participants of the annual State Politics and Policy Conference for their suggestions, most especially Michael Binder and Daniel Biggers, as well as the manuscript’s anonymous reviewers. Any errors that remain are of course our own.

Author information

Correspondence to Janine Parry.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

  1. In 2014, the battery opened with this question for the full sample: “Moving from people to issues, do you happen to know if there will be any ballot measures—which are policy questions for voters to decide—on the November ballot?” YES, NO. If “yes,” interviewers moved to instruction 3 below.

  2. In 2016, the battery opened, for half the sample, with the question above and stopped at YES, NO. The other half heard instead: “From what you have heard or read, will voters in Arkansas this November be voting on any ballot initiatives, referendums, state constitutional amendments, or not?” YES, NO. If “yes,” interviewers moved to instruction 3 below.

  3. Is there a ballot measure, or measures, in which you are particularly interested? Which one? Interviewers were instructed verbally in both years to be generous in placing responses into the following categories. Caller screens advised: DO NOT READ; ALLOW RESPONDENT TO INDICATE MULTIPLE RESPONSES. In 2014: YES [Administrative Rule], YES [Direct Democracy], YES [Term Limits], YES [Alcohol], YES [Minimum Wage], YES [Other topic, Not on ballot], NO. In 2016: YES [any of these: four-year terms for county officials/single candidates automatically elected/defining “infamous crimes” that affect the eligibility of elected officials], YES [governor retains powers and duties even out of state], YES [economic development/job expansion/job creation bonds], YES [medical marijuana amendment, no grow-your-own, smaller # of qualifying conditions], YES [medical marijuana/cannabis initiated act, allows grow-your-own, larger # of qualifying conditions], YES [MEDICAL MARIJUANA GENERALLY], YES [casino gambling, gambling in certain counties, etc.], YES [tort reform, cap on damages in medical lawsuits, cap on attorney fees], YES [Other topic, Not on ballot], NO.

  4. How often do you pay attention to what’s going on in government and politics? ALWAYS (4), MOST OF THE TIME (3), ABOUT HALF OF THE TIME (2), SOME OF THE TIME (1), NEVER (0).

  5. Political knowledge is an index of respondents’ answers to a three-question, rotated battery of national knowledge questions: Who is the Speaker of the U.S. House of Representatives? I have a list of names here; is it (responses randomized): Harry Reid, John Kerry, John Boehner/Paul Ryan, Nancy Pelosi? What is the term of office for a U.S. Senator? Is it: two years, four years, six years, eight years (responses inverted)? Who is responsible for nominating judges to federal courts? Is it (responses randomized): the president, Congress, the Supreme Court, state governors? Coded as percent correct.

  6. Which of the following education categories best describes your highest level of schooling? I have a list here… NO HIGH SCHOOL (1), SOME HIGH SCHOOL (2), HIGH SCHOOL GRADUATE (3), SOME COLLEGE INCLUDING BUSINESS OR TRADE SCHOOL (4), COLLEGE GRADUATE (5), SOME GRADUATE SCHOOL (6), GRADUATE OR PROFESSIONAL DEGREE (7).

  7. How would you describe your views on most political matters? Generally, do you think of yourself as liberal, moderate, or conservative? Coded as a binary variable: strong conservatives and strong liberals (1), all others (0).
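The percent-correct scoring of the knowledge battery in item 5 can be sketched as follows. The variable names and answer strings are illustrative, not taken from the survey instrument; the logic simply counts correct answers across the three items and scales to 0–100.

```python
# Minimal sketch of the political-knowledge index in appendix item 5:
# three national knowledge items, scored as the percent answered correctly.
# Item keys and answer strings are hypothetical labels for illustration.
CORRECT = {
    "house_speaker": "Paul Ryan",   # John Boehner in the 2014 wave
    "senate_term": "Six years",
    "judicial_nominations": "The president",
}

def knowledge_index(responses: dict) -> float:
    """Return the share of the three items answered correctly, on a 0-100 scale."""
    right = sum(1 for item, answer in CORRECT.items()
                if responses.get(item) == answer)
    return 100 * right / len(CORRECT)
```

A respondent answering all three items correctly scores 100, one correct answer scores roughly 33.3, and missing or “don’t know” responses simply count as incorrect, matching a percent-correct coding.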

About this article

Cite this article

Barth, J., Burnett, C.M. & Parry, J. Direct Democracy, Educative Effects, and the (Mis)Measurement of Ballot Measure Awareness. Polit Behav 42, 1015–1034 (2020). https://doi.org/10.1007/s11109-019-09529-w

Keywords

  • Direct democracy
  • Ballot measures
  • Educative effects
  • Voter knowledge
  • Political behavior