The EAVS is an enormous endeavor, both for those administering it and for those responding to it. It asks hundreds of questions and generates hundreds of thousands of data points based on local jurisdiction-level data from all 50 states, the District of Columbia, American Samoa, Guam, Puerto Rico, and the US Virgin Islands.
In this section I will provide a picture, albeit incomplete, of some of what goes into implementing the EAVS. I will focus on four areas:
Survey approval and public comment;
Who administers the EAVS;
Who responds to the EAVS; and
Completing the survey and validating the data.
Through the Paperwork Reduction Act (PRA), the federal government requires that information collections like the EAVS be approved by the Office of Management and Budget. This is to ensure that undue burdens are not placed on the public—in the case of the EAVS, the states—in responding to these types of requests from the federal government.
Although there is a touch of irony in the fact that the PRA itself involves completing some paperwork (electronically, at least), it is an important reminder that asking states to respond to a complex and time-consuming survey is not to be undertaken lightly. Additionally, during this review process, which can take four to six months, there are two mandated and incredibly helpful public comment periods. The survey questions are published, and in 2016 the EAC received dozens of comments from concerned individuals, advocacy groups, and election officials.
Who Administers the EAVS
The EAC administers the EAVS through a contractor hired via a competitive bid process. This has been the case for every EAVS since its inception in 2004. The contractor is the entity that sends out the survey to states, assists jurisdictions during the data collection process, provides analysis, and drafts reports. As I will discuss below, working with a contractor can provide a number of advantages including allowing the EAC to work with some of the foremost experts in the field as well as those who are experts in survey administration.
Another advantage to working with a contractor is the staff, time, and resources that a contractor can bring to bear. During my time at the EAC there was a staff of fewer than 30 people, and of those, only one other staff member and I were focused on research and the EAVS. The EAC has many other responsibilities and manages to accomplish a great deal with a lean staff. However, two researchers are not enough to administer, manage, and provide all the assistance the states need when responding to a survey of this magnitude.
One of the key components of administering this survey is providing technical assistance to states at all stages of the survey. For the 2016 EAVS, two examples of this assistance come to mind as critical to successfully conducting the survey. First was an in-depth needs assessment of each state in the summer before the November election. This allowed the contractor to establish a working relationship with the appropriate points of contact and to get specific information from the states on how they would respond to the survey and on any particular needs or limitations they had.
Second, after the election, when states were in the midst of responding to the survey, the contractor had nearly ten technical assistants tasked with helping states answer any questions they had about the survey. They responded to hundreds of inquiries and were able to assist states on a variety of issues, from technical questions about completing the spreadsheet to larger questions about the definitions of terms in the survey.
Who Responds to the EAVS
The simple answer is that the states, the District of Columbia, American Samoa, Guam, Puerto Rico, and the US Virgin Islands respond to the survey. In reality, who responds varies a great deal from state to state. The EAVS gathers data from states at the jurisdiction level. In most states this is the county level; in some it is the city or township level. States are responsible for collecting this data for all of their jurisdictions.
Several states in 2016 were able to respond to a majority of the survey using information in their statewide election management systems—sometimes these are referred to as top-down states, where the information is gathered and to some extent controlled at the state level. These states often have many of the EAVS questions pre-programmed into their statewide management systems and can generate responses with relative ease. One challenge for top-down states is ensuring that the data generated for the EAVS from their systems can be converted into the EAVS template for submission.
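As a rough illustration of that conversion step, the sketch below shows the kind of field mapping a top-down state might apply to its system's export. The export fields and template column names here are entirely hypothetical; the real EAVS template contains hundreds of columns.

```python
# Illustrative only: these export fields and EAVS-style column names are
# hypothetical, standing in for a real state system's export format.
FIELD_MAP = {
    "reg_total": "A1_registered_voters",  # hypothetical template column
    "ballots_cast": "F1_total_ballots",   # hypothetical template column
}

def to_template_row(export_row):
    """Rename a system export record's fields to template column names,
    dropping any fields the template does not ask for."""
    return {FIELD_MAP[k]: v for k, v in export_row.items() if k in FIELD_MAP}

# A hypothetical export record with one extra field the template ignores:
row = to_template_row({"reg_total": 1200, "ballots_cast": 900, "precinct_id": 7})
```

The real work, of course, is in building and maintaining a mapping like this across hundreds of questions, and in handling fields the state's system does not track at all.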
On the other end of the spectrum are bottom-up states, where the survey questions need to be sent to each jurisdiction, and each jurisdiction must then complete the survey and send it back to the state. The state then needs to combine all of these jurisdictions' data into the EAVS format and submit it in one file. This can present numerous challenges. In some cases, states have little authority to compel jurisdictions to respond, and jurisdictions unfamiliar with the survey can find it confusing. In these cases, local jurisdictions often reach out for technical assistance. And in several jurisdictions, responses to the 2016 survey were completed by hand and sent by mail.
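The combining step a bottom-up state faces can be sketched as follows. This is a simplified illustration with made-up column names, not the actual EAVS template; it assumes each jurisdiction returns a file sharing one header row, which is itself an optimistic assumption given that some responses arrived on paper.

```python
import csv
import io

def combine_jurisdictions(files):
    """Merge per-jurisdiction CSV responses that share one header row
    into a single state-level list of rows."""
    header = None
    combined = []
    for f in files:
        reader = csv.DictReader(f)
        if header is None:
            header = reader.fieldnames
        elif reader.fieldnames != header:
            # A jurisdiction deviated from the shared template's columns.
            raise ValueError("file does not match the shared template")
        combined.extend(reader)
    return header, combined

# Two hypothetical jurisdiction responses:
county_a = io.StringIO("jurisdiction,registered_voters\nCounty A,1200\n")
county_b = io.StringIO("jurisdiction,registered_voters\nCounty B,3400\n")
header, rows = combine_jurisdictions([county_a, county_b])
```

Even this toy version surfaces the core difficulty: a single nonconforming jurisdiction file stops the state-level merge until someone reconciles it by hand.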
Completing and Validating the Survey
States respond to the EAVS using an Excel template. This Excel document includes a series of macros to help respondents enter the data and review it for errors. In 2016 another Excel template was created to allow for copying and pasting jurisdiction-level data. However, Excel is not the ideal tool for collecting this amount of data from this many jurisdictions—more on that when I discuss potential improvements to the EAVS.
After states submit their data, the data are reviewed for accuracy. This includes attempting to determine what any empty cell represents: data that are missing, data left blank because the question does not apply to the state (for example, a state without Election Day registration has no data for that question), a true value of zero, or a cell mistakenly left empty. Attempts are also made to catch math errors, such as impossible values—for example, a state showing it had more rejected provisional ballots than provisional ballots issued. And in 2016, for the first time, more advanced statistical methods were used to check whether the data fell within what would be expected for a jurisdiction based on data from previous years and the characteristics of the jurisdiction. Reports flagging possibly problematic data were generated and sent to each state, giving states an opportunity to make changes or confirm that the data were accurate.
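The kinds of checks described above can be sketched as below. The field names and the 50 percent tolerance are my own illustrative assumptions, not the EAC's actual validation rules, and the real statistical screening was considerably more sophisticated than a simple deviation from a historical mean.

```python
# Illustrative validation checks; thresholds and semantics are assumptions.

def classify_cell(value, question_applies):
    """Decide what an empty or zero cell most likely represents."""
    if value in (None, ""):
        return "not applicable" if not question_applies else "missing"
    if value == 0:
        return "true zero"
    return "reported value"

def impossible_provisional(issued, rejected):
    """The math-error example from the text: more rejected provisional
    ballots than provisional ballots issued."""
    return rejected > issued

def outside_expected(value, prior_values, tolerance=0.5):
    """Crude stand-in for the statistical screening: flag values more
    than `tolerance` (here 50%) away from the jurisdiction's historical
    mean."""
    mean = sum(prior_values) / len(prior_values)
    return abs(value - mean) > tolerance * mean
```

Checks like these only flag data for review; as the text notes, the final step is always a report back to the state, which can correct the value or confirm it is accurate.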