2000 Florida Ballots Project

Frequently Asked Questions

What was the goal of the project?

The goal was to gather data on the appearance of the ballots that were not certified in the November 2000 United States presidential election in Florida, and to create an archive of the markings that could be used to examine the reliability of the various voting systems used in Florida.

What was the involvement of the news organizations?

This project was conceived and sponsored by a consortium of news organizations. The news organizations were responsible for securing county cooperation, paying all associated county fees, and ensuring proper presentation of the uncertified ballots. The news organizations conducted individual analyses of the data and prepared reports for publication and broadcast.

Who collected the data?

The National Opinion Research Center (NORC) at the University of Chicago conducted the data collection. NORC is a research organization with an excellent reputation, developed over half a century, for nonpartisan data collection and analysis.

How many uncertified ballots were there?

A total of 175,010 uncertified ballots were examined: 113,820 overvotes (the voter selected more than one candidate for president) and 61,190 undervotes (the voter did not select a candidate for president or, for some reason, the vote-counting mechanism did not register a vote for president).
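
The undervote/overvote distinction is mechanical and can be illustrated in a few lines of code. The sketch below is purely illustrative (it is not NORC's software) and classifies a hypothetical ballot by the number of presidential selections it registered:

# Illustrative sketch only -- not NORC's code. Classifies a ballot
# by how many presidential votes it registered.

def classify_ballot(selections: list[str]) -> str:
    """Classify a ballot by its number of registered presidential votes."""
    if len(selections) == 0:
        return "undervote"   # no candidate registered
    if len(selections) > 1:
        return "overvote"    # more than one candidate registered
    return "certified vote"  # exactly one candidate registered

print(classify_ballot([]))                # undervote
print(classify_ballot(["Bush", "Gore"]))  # overvote
print(classify_ballot(["Nader"]))         # certified vote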

Sixty-five of Florida’s 67 counties use one of three voting systems. Table 1 presents the three voting systems and the number of counties using each, as well as the number of undervotes and overvotes examined.

When did the examination begin?

Extensive training of the data collection staff began in Florida in late January 2001. Ballot examination began on Monday, February 5, 2001, and continued through May.

How were the ballots examined?

In each of the counties, local election officials assigned county workers to display the ballots. Coding teams from NORC (one or three coders per team) reviewed each ballot and recorded the markings they observed. On three-coder teams, the coders sat side by side but worked independently, each making an individual determination of the appearance of every ballot; they did not talk among themselves or consult one another in any way.

What specifically did the coders look for?

Coders recorded the condition of each ballot examined. For Votomatic (and to some extent Datavote) ballots, coders noted whether chads were dimpled and, if so, whether light was shining through the dimple. (Each coder worked with a small light table to help check for light.) Coders also noted whether chads were completely punched out or hanging by one, two, or three corners. For optical scan ballots, and for any Votomatic or Datavote absentee ballots completed outside the voting booth, coders noted whether the ovals or arrows were fully filled or otherwise marked (with a check, slash, X, etc.). Coders noted whether there were stray marks on the ballot that could confuse a scanning machine and whether ballots were uncertified because the wrong color ink was used. Finally, coders recorded verbatim any written notations on the ballots.
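
As an illustration only -- this is not NORC's actual coding instrument -- the chad conditions described above could be represented as a small set of codes, with each coder's observation recorded against a ballot and candidate position (the field names below are hypothetical):

# Illustrative sketch only -- not NORC's coding instrument. It
# enumerates the chad conditions described above for one candidate
# position on a Votomatic-style ballot.

from enum import Enum

class ChadCode(Enum):
    CLEAN_PUNCH = "chad completely punched out"
    HANGING_1 = "chad hanging by one corner"
    HANGING_2 = "chad hanging by two corners"
    HANGING_3 = "chad hanging by three corners"
    DIMPLE_LIGHT = "dimpled chad, light visible through it"
    DIMPLE_NO_LIGHT = "dimpled chad, no light visible"
    NO_MARK = "no discernible mark"

# One coder's observation of one candidate position (hypothetical fields):
observation = {"county": "Palm Beach", "ballot_id": 1042,
               "position": 5, "code": ChadCode.DIMPLE_LIGHT}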

Was every ballot reviewed by three coders?

No. All undervotes, and the overvotes from three test counties (Polk, Pasco, and Nassau), were reviewed by three coders. In the three test counties (one Votomatic, one Datavote, and one optical scan), coders reviewed each overvote to determine whether three coders were necessary. High agreement among the three coders indicated that overvotes were easier to code than undervotes, and the decision was made to code the remaining overvotes with one coder. Overvotes are indeed easier to code: producing an overvote requires more than one fully punched chad or more than one fully completed oval or arrow, and such markings are more easily identified than dimples.
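
The agreement check described above can be sketched as follows. This is a hypothetical reconstruction; the agreement cutoff shown is illustrative, and the source does not state the actual decision rule:

# Hypothetical sketch of the test-county agreement check. Given three
# coders' codes for each overvote, measure how often all three agreed.
# The 0.7 cutoff is illustrative, not NORC's actual rule.

def unanimous_rate(codings: list[tuple[str, str, str]]) -> float:
    """Fraction of ballots on which all three coders agreed."""
    unanimous = sum(1 for a, b, c in codings if a == b == c)
    return unanimous / len(codings)

test_county_overvotes = [("punch", "punch", "punch"),
                         ("punch", "punch", "punch"),
                         ("fill", "fill", "fill"),
                         ("punch", "dimple", "punch")]

if unanimous_rate(test_county_overvotes) > 0.7:  # illustrative cutoff
    print("High agreement: code remaining overvotes with one coder")
else:
    print("Keep three coders per overvote")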

What was done to ensure accuracy in the field?

Because the data set is intended to be the authoritative description of the uncertified ballots, a number of steps were taken to ensure high quality. First, only qualified individuals were hired to review the ballots. Because of the nature of the task, all coders were administered a near-point vision test before being staffed on the project. Project coders were trained and tested on coding procedures before being allowed to code. Team leaders -- who were long-term NORC employees -- conducted the training and worked closely with the coders to ensure consistently high performance. Every evening, before shipping the coding forms to NORC, the team leaders reviewed the forms for completeness and legibility. NORC also attempted to verify the accuracy of the coding by randomly selecting ballots from every county to recode. These recodings were later matched with the original codings and reviewed for consistency.

What happened next with the data?

Information on the ballot markings was recorded on coding forms that were sent to the NORC offices daily. At NORC, a trained team of data entry specialists entered the information into electronic files.

What was done to ensure accuracy of the data entry?

Each data form was entered twice, by two different data entry clerks. The results of the two keyings were compared, and data entry supervisors conducted an adjudication process: differences between the two entries were reviewed and the appropriate corrections were made, with supervisors consulting the coding forms as necessary. Typos, out-of-range codes, and other anomalies were reconciled during this process.
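
A minimal sketch of the double-entry comparison follows; the field names are made up, and the actual adjudication was a manual, supervised process rather than a script:

# Illustrative sketch of double data entry with adjudication. Fields
# on which the two keyings disagree are flagged for a supervisor to
# resolve against the paper coding form.

def discrepancies(entry_a: dict, entry_b: dict) -> list[str]:
    """Return the fields on which the two keyings disagree."""
    return [field for field in entry_a if entry_a[field] != entry_b[field]]

first_keying  = {"ballot_id": "1042", "position_5": "dimple", "position_6": "none"}
second_keying = {"ballot_id": "1042", "position_5": "punch",  "position_6": "none"}

for field in discrepancies(first_keying, second_keying):
    # Flag the field for manual adjudication against the coding form.
    print(f"Adjudicate {field}: {first_keying[field]!r} vs {second_keying[field]!r}")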

Were there other steps as well?

For the final data review step, NORC assigned an independent team of statisticians to examine the data. These statisticians reviewed the data and approved them for release to the media group and to the public.

What is the final product?

NORC compiled 17.5 million pieces of information into two primary data sets. One is a ballot-level database (the raw database) that contains information on every chad or candidate space on every ballot across the 67 counties. This file does not attempt to align candidate information across ballots; it simply reflects the reality of the disparate ballot designs used throughout the state of Florida. The second is an aligned database that does reconcile every coder's information for every ballot for each presidential and U.S. Senate candidate. This file contains the first processing step necessary to facilitate comparison of the codings for each candidate regardless of his or her various ballot positions across the state. The raw database is the definitive historical archive of every mark on the uncertified ballots. The aligned file is an analyst's tool, presenting only the markings related to the various candidate positions on each county's ballot.
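
The difference between the two files can be sketched as follows. The layouts and field names are hypothetical; the point is that the raw file records marks by ballot position, which varies by county, while the aligned file re-keys the same marks by candidate:

# Illustrative sketch of the alignment step, with hypothetical
# county layouts. The raw database is position-indexed; the aligned
# database re-keys each ballot's marks by candidate name.

LAYOUTS = {
    "County A": {3: "Bush", 4: "Gore", 5: "Nader"},
    "County B": {2: "Gore", 3: "Nader", 4: "Bush"},
}

def align(county: str, raw_marks: dict[int, str]) -> dict[str, str]:
    """Re-key a ballot's position-indexed marks by candidate name."""
    layout = LAYOUTS[county]
    return {layout[pos]: mark for pos, mark in raw_marks.items() if pos in layout}

# The same position-indexed markings from the raw file...
raw = {3: "dimple", 4: "clean punch"}
# ...become candidate-keyed records in the aligned file:
print(align("County A", raw))  # {'Bush': 'dimple', 'Gore': 'clean punch'}
print(align("County B", raw))  # {'Nader': 'dimple', 'Bush': 'clean punch'}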

Secondary data sets include the ballot notations copied verbatim by NORC’s coders, the demographic characteristics of the coders, the recoding data collected while coding, and a number of files produced by the media group. The media group files contain qualitative and quantitative county- and precinct-level information used by the media in their analyses.

Can I obtain the data?

Yes. The files are now publicly available and may be downloaded at no charge from the 2000 Florida Ballots Project website at http://www.electionstudies.org/florida2000.

The 2000 Florida Ballots Project website and files are maintained as a service to the public by the American National Election Studies (ANES) at the University of Michigan.