Creating an Exit Poll Ballot

This is part 3 of my "How to Run an Exit Poll" series.

The exit poll survey ballot is important, but not complicated. The only question of interest, other than the voter's ballot choices, is the method of voting. Data will be available at the end of the day with separate totals for the machine-cast votes and the scanned paper votes. There will be no official count of provisional votes at this station, so we can only compare those votes to the overall total for the polling station. But that comparison allows us to evaluate whether handing people provisional ballots amounts to a voter suppression tactic.

Since space on our survey form is at a premium, and because such information makes responses less anonymous, I do not recommend including questions about age, race, or gender. Generally speaking, you want to keep the words to a minimum. (Not an easy task for me.)

Here is an example survey I have developed for exit polls in Sedgwick Co.  I included a short paragraph at the top because I feel it’s important to let people know why you want this information and reassure them that results are anonymous, just like their vote.

Sample Exit Poll Survey

The first question is really too long, but I wanted to be as clear as I could. In Sedgwick County, Kansas, there are three possible options: a vote cast via electronic voting machine, a paper ballot that the voter feeds into a scanner for on-site electronic counting, or a provisional ballot, which is a paper ballot sealed into an envelope to be counted later (maybe).

Asking about the specific races is straightforward. State the office and then list the candidates. Having voters circle their answers eliminates the need for a blank or a box to check and saves space on the page.

Staggering the answers for questions with more than one line of answers (e.g., President) makes it easier to discern the voter's intent. When the answers are stacked directly one above another, the response can easily become ambiguous.

Since a single polling location will have multiple precincts voting there, it's problematic to ask about races in which different precincts vote for different candidates. Generally, I want to confine the questions to races that will appear on every ballot at the polling location. On the other hand, my site managers for the SW Wichita location are very interested in the county commissioner races. We arrived at the following:

Who did you vote for in your County Commissioner race? (Select one for District 2 OR District 3) – sw-wichita-nov-8-exit-poll-ballot

I hope we won't get too many voters identifying their choices for both District 2 and District 3, but I expect we will get some. On the other hand, it's the only question that could be spoiled that way, and I'm reasonably comfortable assuming such mishaps are equally likely to occur regardless of which candidate the voter supports. I think we will get good data from this exit poll.

How to Run an Exit Poll Part 1

How to Run an Exit Poll Part 2



How to Run an Exit Poll Part 2

Here are more instructions, with links, to help people who want to run an exit poll. This is all pre-election-day work, but it doesn't include developing your sample survey form; I'm going to devote a separate post to that and will link to it when it's up. I'm also planning an Excel template that people can download, input their exit poll results and official polling location results into, and get back the probability of the difference between the two occurring by chance alone.
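I haven't built that template yet, but for readers who prefer code to a spreadsheet, here is a minimal sketch of the kind of calculation it will perform: an exact binomial check of how likely a discrepancy at least as large as the one observed would be if the official count were accurate. Every number in the example is made up, not data from any of my sites.

from math import comb

def binomial_p_value(k, n, p):
    """Two-sided p-value (doubled smaller tail, capped at 1) for seeing k
    'votes for the candidate' among n exit poll surveys when the official
    share is p."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    lower = sum(pmf[:k + 1])   # P(X <= k)
    upper = sum(pmf[k:])       # P(X >= k)
    return min(1.0, 2 * min(lower, upper))

# Hypothetical inputs: the candidate received 54% of the official count at the
# polling place, and 97 of 200 exit poll respondents said they voted for them.
p_value = binomial_p_value(k=97, n=200, p=0.54)
print(f"Probability of a discrepancy this large by chance alone: {p_value:.3f}")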

Training: I highly recommend running a short training session with your volunteers. Thirty minutes is sufficient, an hour if it's a large group. Go over the instructions, let them see the supplies (ballot box, survey forms, etc.), and have each participant go through some practice interactions with their fellow volunteers. It also gives everyone a chance to get acquainted prior to election day.

Schedule: You'll need to prepare a schedule in advance and coordinate with the volunteers to make sure they can and will be there for their appointed hours. Here is the example-volunteer-schedule I used in my August primary exit poll. Notice that some volunteers are tasked with bringing in fresh supplies. Everyone got a copy of this ahead of time, so they knew when they were working and with whom. Always have at least two volunteers manning the exit poll. This is a basic safety precaution: if one volunteer has a heart attack, the other can notice and call 911.

Tally Sheet: You need a form for volunteers to record the number of refusals. Here is what I used in my August primary exit poll. The time categories correspond to the hours different volunteers worked. I taped a copy of this to the table and had the volunteers record their refusals on it. Some volunteers kept separate tallies for themselves when it was busy and transferred them to that page later, which is also fine.

Supplies: You'll need to arrange for all supplies to be on site, including refreshments. I had different volunteers bring refreshments throughout the day. Suggested Supply List

Costs: While many of the supplies are things you may already own or can borrow at no cost, printing hundreds of survey copies to be filled out will cost money. I am estimating costs for the sites I am coordinating at $50 to $100 per site, including refreshments.

Another option is an electronic app. I'm not using one for my sites because voter-marked, hand-counted paper surveys have the same advantages that paper ballots do. Then again, I have donations sufficient to fund the needed supplies for the exit polls I am coordinating, so my out-of-pocket costs are minimal. If cost is a constraint, an app is an acceptable alternative for the 2016 election.

Permission Slip: It's not strictly required, but in addition to notifying the appropriate person about setting up your exit poll at their location, having documentation of that person's permission isn't a bad idea. I phoned and introduced myself, sent the person an email, and picked up the permission-slip in person weeks ahead of time. I kept it on site at the exit poll. No one confronted me about my right to be there, so it wasn't needed, but if someone had, it would have been nice to have.

How to Run an Election Exit Poll

I'm working to set up multiple exit polls here in Kansas in November, and I thought that people in other areas might be interested in setting one up themselves. Just one dedicated individual, along with a few additional volunteers working a few hours apiece, can pull this off. You may also need to spend $50 to $150 on supplies, such as copies of your survey and refreshments to offer voters.

The dedicated individual is the exit poll site manager.  The additional volunteers only need to spend a few hours on election day soliciting voters to complete surveys.  Two people should be manning the exit poll booth at all times, just in case any emergency situations arise.  This post outlines what an exit poll site manager needs to do to run a successful exit poll.

The approach I recommend is called cluster sampling. Each site provides an independent check on the accuracy of the official counts at that polling station. If we combine information from multiple sites, the data can provide excellent precision for determining whether the discrepancies found are reasonable and evenly distributed or whether they show evidence of systematic bias, which would indicate that our votes are not being counted accurately.

This approach means that you need to try to contact all voters at that location to request they complete an exit poll survey.  The reason for this is that the purpose of this exit poll is to validate the official results at that location.   It is not to make predictions prior to the close of polls.  It is not to analyze for demographic information afterwards.  It is to validate the official results.  By concentrating our efforts at relatively few polling stations, we can attain a higher level of confidence that the results of our survey are representative of the polling location.

The first thing to ask is the method by which they voted. In my location, there are three options: by machine, by scanned paper ballot, or by provisional ballot. At the end of the day, I get the counts of votes cast for each candidate by machine and by scanned paper ballot. Provisional votes are not counted until a determination is made about whether that person was eligible to cast the ballot. The exit poll results for provisional ballots can be compared to the accepted ballots. If significant deviations occur, that is a measure of the impact of voter suppression attempts, such as voter ID laws.

Before finalizing your design of the exit poll survey, you will need to select a location. Contact your local elections office and get a list of all the polling locations. Let them know you are planning a citizens' exit poll and ask if they have any regulations or laws that would affect it. In Sedgwick County, the only significant rule was that we could only approach people after they voted, not before. There is also a law regarding the distance required for any electioneering, but as long as we only approach exiting voters, it is not a concern.

Site Manager Duties:

The site manager is the point person for everything to do with an exit poll at a polling place. They will be there in the morning to set everything up, and they will wait at the polling location to get the official results when the machines have finished printing out the records. They will not need to be there the entire day, but they do need to be available if any problems arise.

Prior to Election Day:

  1. Select polling location: Site managers need to consider the polling places available near their home, perhaps even scouting the locations to determine which they would prefer to exit poll. You will need to contact the owner/manager of that location and inform them of what you will be doing. Find out if they have any concerns and address them, or refer them back to me. Obtain written permission to be on their premises to conduct this survey when appropriate.
  2. Finalize Survey: While there will be three races applicable to all of Sedgwick County, because site managers decide what locations they will monitor, they have the option to add questions specific to their polling place, such as state legislative or judicial races.
  3. Prepare Supplies: The site manager will decide on and arrange for all supplies to be there: tables, chairs, refreshments, survey forms, ballot box, etc. Suggested Supply List
  4. Schedule volunteers: We'll need to meet together to accomplish this. I'll keep a list of volunteers, and we can discuss where volunteers are needed and when. I'm also going to see if I can get some student help for the times before and after school, which are often the busy times at polling locations as well.

Election Day: It's a long day, but the duties could be split between two people, say a morning manager and an evening manager.

  1. Set up:  The site manager arrives half an hour before the polling station opens.  They set up the exit poll booth, making sure everything is ready for the first voter of the day.
  2. Maintain: The site manager is the person on call for any issues that arise. They should stay close by, available to take care of whatever comes up. Run out of survey forms? The site manager brings more. A volunteer calls in that they can't make it after all? The site manager either fills in or finds someone who can.
  3. Close down:  The site manager will be responsible for securing the completed survey forms and counting them.  The site manager will ensure the booth area is cleaned up and all borrowed equipment is returned.
  4. Official Results: The site manager will need to remain on the premises to collect the official results for that polling location. Meet with the election officials for your location sometime in the morning to let them know you will be doing this. They should allow you to examine the results tape for yourself. However, if they object, you can ask them to fill out your survey form with the machine results and the scanned paper ballot results from the printouts. In addition, ask them to give you the total number of provisional ballots turned in for that location.



After Election Day:

  1. Count your results
  2. Publicly post both the official results and your exit poll results for your polling location, or email me your results and I'll post them on my site


Exit poll went well; no significant signs of election fraud.

With help from nearly a dozen volunteers, I conducted an exit poll at one polling location during this primary. It even made the local newspaper. I am quite pleased with the results; everything went smoothly.

It was primarily meant to be a trial run for the November election, making sure that I will be able to collect the data necessary to identify problems with our machine counts. While some mistakes were made (all by me; the volunteers were fantastic!), I feel confident that we will be able to accomplish that task in November.

I know that many people are interested in the results of this survey.  Overall, things looked good.  There were a couple of yellow flags, but nothing I would recommend taking action on.

Data Collected: The primary question I asked was how the individual had voted (by machine, with a scanned paper ballot, or with a provisional paper ballot): Aug 2 Exit Poll Ballot

The exit poll was conducted at one polling location, with survey responses being compared to the machine-tabulated results at that location. Respondents were asked how they voted: by machine, by scanned paper ballot, or by provisional paper ballot. Results are shown below. Due to the small number of paper ballots, both scanned and provisional, analysis results are shown for the machine tallies and for the totals for the polling location, but not for the paper ballots separately. The count of votes counted and surveys collected is shown below in Table 1.

Table 1:

Analysis Table 1


There is a discrepancy between the official count of provisional ballots (1) and the exit poll count (3). This is likely due to errors in marking the exit poll survey, so I am not concerned about this discrepancy.

There were an additional 47 surveys collected that were unusable due to problems ranging from being completely blank to having responses filled in for all races, both Democratic and Republican.

We asked about six races with two candidates each, three races in each party. However, only three of those races were applicable to everyone who voted at that location. There were five precincts voting at the polling location, and three of the races asked about were limited to voters in only one or two of those precincts. As a result, survey takers could indicate a choice in those three races even if they did not actually vote on them. For that reason, I have labeled the data collected on those three races as 'questionable'. Caution should be used in drawing conclusions from the exit poll data for those races.

The results for the six races are as follows, with the winners' names bolded in Table 2.

Table 2:

Analysis Table 2

Assuming that the official results were accurate, I computed the probability of our exit poll results using the binomial distribution. I rated those results as Green (looks good), Yellow (suspicious but not conclusive), or Red (definitely something wrong). The usual threshold for statistical significance is below 5%. There were no red flags, but two of the six races got a yellow caution rating. These results are shown in Table 3.

Table 3:

Analysis Table 3
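For anyone who wants to reproduce this kind of rating on their own numbers, here is a minimal sketch of the step that turns each race into a flag, assuming SciPy 1.7 or later for the exact binomial test. The race names, counts, and the Red cutoff below are placeholders for illustration, not my actual Table 3 data; the only threshold carried over from the analysis above is the usual 5% level.

from scipy.stats import binomtest

# (race, exit poll votes for the candidate, exit poll total, official share)
RACES = [
    ("US Senate (Dem)", 44, 120, 0.52),   # hypothetical numbers
    ("US Senate (Rep)", 71, 150, 0.49),
    ("US House (Dem)",  60, 118, 0.55),
]

def flag(p_value):
    if p_value < 0.001:   # placeholder cutoff for "definitely something wrong"
        return "Red"
    if p_value < 0.05:    # the usual statistical-significance threshold
        return "Yellow"
    return "Green"

for race, k, n, share in RACES:
    p = binomtest(k, n, p=share, alternative="two-sided").pvalue
    print(f"{race:16s} p = {p:.3f} -> {flag(p)}")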

The races that all survey respondents voted on were the U.S. Senate (Dem and Rep) and the U.S. Representative (Dem) races. Results for the losing candidates are shown in Figure 1.


Figure 1:

Analysis Figure 2

The Senate race for the Democratic candidates is given a yellow warning because the probability of the difference between the official results and our exit poll is only 3%. This is not considered a red flag because we are making 12 different comparisons, which needs to be taken into account in assessing the results. For example, if 12 comparisons are made using a 5% threshold, there is a 45.96% probability of at least one of them falling below that threshold by random chance alone. There's a whole set of statistical techniques designed to account for multiple comparisons if I wanted to get really precise about it. In addition, while the official votes skewed towards Ms. Singh, she lost the statewide election, so even if there was manipulation, it would not have affected the outcome of the race.
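Here is the quick arithmetic behind that 45.96% figure, along with a Šidák-style adjusted per-race threshold for anyone who wants to be strict about the multiple-comparison issue. The adjustment is a standard generic correction I'm noting for completeness, not something applied in the ratings above.

# Family-wise error rate for 12 independent comparisons at a 5% threshold,
# and the per-race threshold that would keep the overall false-alarm rate at 5%.
alpha, m = 0.05, 12

family_wise = 1 - (1 - alpha) ** m       # chance at least one race dips below 5% by luck
per_race = 1 - (1 - alpha) ** (1 / m)    # Sidak-adjusted per-race threshold

print(f"P(at least one of 12 races below 5% by chance) = {family_wise:.4f}")  # ~0.4596
print(f"Adjusted per-race threshold                    = {per_race:.4f}")     # ~0.0043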

We had no method to identify which precinct respondents were in, so for the Kansas House and Senate races, survey takers could mark a choice for someone who was not on their precinct's ballot. For this reason, the exit poll data must be considered questionable. On the Republican side, since no precinct voted on both the House and Senate races, the 38 surveys with both of those races marked were not included in the totals for those two races; a small sketch of this exclusion rule appears after Figure 2. Results for the losing candidates of these races are shown in Figure 2.

Figure 2:

Analysis Figure 1
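Since the surveys will usually get typed into a spreadsheet or CSV file for counting, here is a small, purely illustrative sketch of that exclusion rule. The file name and column names are hypothetical; the only thing taken from the analysis above is the rule itself: drop any survey that marked both the Republican Kansas House and Kansas Senate races.

import csv

def usable_for_state_races(row):
    """False for surveys that marked both the KS House and KS Senate (Rep) races,
    since no precinct at this location voted on both."""
    return not (row["ks_house_rep"].strip() and row["ks_senate_rep"].strip())

with open("surveys.csv", newline="") as f:   # hypothetical data-entry file
    rows = list(csv.DictReader(f))

kept = [r for r in rows if usable_for_state_races(r)]
print(f"Excluded {len(rows) - len(kept)} surveys from the KS House/Senate totals")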

The official results for the Kansas Representative District 87 race get a yellow rating. The results were skewed towards Mr. Alessi, with only around a 1% probability of the difference occurring by random chance. This is not rated as red because the exit poll data for this race was questionable. And again, since Mr. Alessi lost the election, even if there was manipulation, it would not have affected the outcome of the race.

A Replication of My Work

Mr. Brian Amos, a Ph.D. candidate at the University of Florida, was dedicated enough to replicate some of my work and confirm that he gets the same results I reported.

He does have a few disagreements with my approach. For example, to what he describes as a nitpick, I would respond: that's a feature, not a bug! My choice of limiting one analysis to the precincts with more than 500 votes cast results in what he considers an overemphasis on the effect I am concerned with. This is absolutely true. That particular analysis was designed to draw out that effect and make it more apparent. The vote share data is very noisy and impacted by many different factors. The trend is real, but it is easily missed in the inherent noise of the larger dataset.

Wichita 2014 Election Results

Mr. Amos wonders whether some other correlated factor, such as voter registration numbers, would display a similar trend in the cumulative chart. He shows this is true for the share of Republicans in this particular data set. But this is not a universally correlated trait across the different states where such trends have been found, and it was not enough in Sedgwick County, Kansas, to account for the difference in vote share.

I discuss this factor at more length in my recently published paper "Audits of Paper Records to Verify Electronic Voting Machine Tabulated Results" in the Summer 2016 issue of The Kansas Journal of Law and Public Policy. The graph displayed above is from that paper, illustrating that although there is an upswing in the cumulative graph for the share of Republicans, it is much smaller than the upward surge in vote share for various Republican candidates in 2014.
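For readers who want to build this kind of cumulative chart for their own county, here is a minimal sketch of the calculation, assuming you have precinct-level official results. The precinct numbers below are invented for illustration; real charts use the full set of official precinct totals.

# Sort precincts from smallest to largest by ballots cast, then track the
# running (cumulative) vote share for a candidate or party as each precinct
# is added. A roughly flat line means the share is unrelated to precinct size;
# a steady upswing in the larger precincts is the pattern discussed above.

precincts = [
    # (ballots cast, votes for the candidate) -- hypothetical numbers
    (120, 55), (340, 150), (410, 190), (650, 330), (900, 480), (1500, 840),
]
precincts.sort(key=lambda p: p[0])

total_ballots = total_candidate = 0
for ballots, candidate_votes in precincts:
    total_ballots += ballots
    total_candidate += candidate_votes
    share = total_candidate / total_ballots
    print(f"through precincts of size <= {ballots:>5}: cumulative share = {share:.3f}")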

His parting comment, "While the charts may be explainable through vote fraud, there are other, perfectly innocuous explanations that can be put forward, as well," is true. Yes, there are other possible and innocuous explanations. Statistical analysis only illuminates correlations and other relationships; further investigation is needed to determine cause. Just because the trend is a predicted sign of election fraud does not mean election fraud occurred.

The only way to tell whether our machine-tabulated vote count is accurate or has been undermined is to conduct a proper audit. That has never been done here in Sedgwick County. I've requested access to do this as a voter and been denied. I filed the proper paperwork in a timely manner asking for a recount of those records after the 2014 election and was denied. I've sued for access as an academic researcher and been denied.

Why should I trust a vote count that our officials will not allow to be publicly verified? Why should anyone?