Electronic voting in the United States


Electronic voting in the United States involves several types of machines: touch screens for voters to mark choices, scanners to read paper ballots, scanners to verify signatures on envelopes of absentee ballots, and web servers to display tallies to the public. Aside from voting, there are also computer systems to maintain voter registrations and display these electoral rolls to polling place staff.
Election machines are computers, often 10–20 years old, since certification and purchase processes take at least two years and offices lack the money to replace machines until they wear out. Like all computers they are subject to errors, which have been widely documented, and to hacks, which have not been documented, though security flaws that would permit undetectable hacks have been documented.
Most election offices handle thousands of ballots, with an average of 17 contests per ballot, so machine-counting can be faster and less expensive than hand-counting.

Voluntary guidelines

The Election Assistance Commission is an independent agency of the United States government which developed the 2005 Voluntary Voting System Guidelines. These guidelines address some of the security and accessibility needs of elections. The EAC also accredits three test laboratories which manufacturers hire to review their equipment. Based on reports from these laboratories the EAC certifies when voting equipment complies with the voluntary guidelines.
Twelve states require EAC certification for machines used in their states. Seventeen states require testing by an EAC-accredited lab, but not certification. Nine states and DC require testing to federal standards, by any lab. Four other states refer to federal standards but make their own decisions. The remaining eight states do not refer to federal standards.
Certification takes two years, costs a million dollars, and is needed again for any equipment update, so the market for election machines is difficult to enter.
A revision to the guidelines, known as VVSG 1.1, was prepared in 2009 and approved in 2015. Voting machine manufacturers can choose which guidelines they follow. A newer version, known as VVSG 2.0 or the VVSG Next Iteration, has been written and is under review.

Optical scan counting

In an optical scan voting system, each voter's choices are marked on one or more pieces of paper, which then go through a scanner. The scanner creates an electronic image of each ballot, interprets it, creates a tally for each candidate, and usually stores the image for later review.
The voter may mark the paper directly, usually in a specific location for each candidate, then mail it or put it in a ballot box.
Or the voter may select choices on an electronic screen, which then prints the chosen names, usually with a bar code or QR code summarizing all choices, on a sheet of paper to put in the scanner. This combination of screen and printer is called an electronic ballot marker or ballot marking device, and voters with disabilities who cannot interact with the screen or paper directly can communicate with it by headphones, large buttons, sip-and-puff devices, or paddles. Typically the ballot marking device does not store or tally votes. The paper it prints is the official ballot, which is put into a scanning system that counts the bar codes, or the printed names can be hand-counted as a check on the machines. Most voters do not look at the paper to ensure it reflects their choices, and when there is a mistake, 93% of voters do not report it to poll workers.
Two companies, Hart and Clear Ballot, have scanners which count the printed names, which voters had a chance to check, rather than bar codes and QR codes, which voters are unable to check. When scanners use the bar code or QR code, the candidates are represented in the bar code or QR code as numbers, and the scanner counts those codes, not the names. If a bug or hack makes the numbering system in the ballot marking device different from the numbering system in the scanner, votes will be tallied for the wrong candidates. This numbering mismatch has appeared with direct recording electronic machines.
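The risk of a numbering mismatch can be illustrated with a small sketch. The candidate tables and votes below are hypothetical, not drawn from any actual voting system; the sketch only shows how tallying numeric codes against a mismatched candidate table credits votes to the wrong names even though the printed names on the paper are correct.

```python
# Hypothetical numbering the ballot marking device uses when encoding a QR/bar code.
bmd_codes = {1: "Candidate A", 2: "Candidate B", 3: "Candidate C"}

# Numbering the scanner uses when decoding the same codes; here a
# configuration error has swapped codes 2 and 3.
scanner_codes = {1: "Candidate A", 2: "Candidate C", 3: "Candidate B"}

# Votes as the voters made them on the marking device.
votes_cast = ["Candidate B", "Candidate B", "Candidate C", "Candidate A"]

# The device prints each choice as its own numeric code...
encoded = [next(code for code, name in bmd_codes.items() if name == v)
           for v in votes_cast]

# ...and the scanner tallies the codes using its own (mismatched) table.
tally = {}
for code in encoded:
    name = scanner_codes[code]
    tally[name] = tally.get(name, 0) + 1

print(tally)  # {'Candidate C': 2, 'Candidate B': 1, 'Candidate A': 1}
# Candidate B's two votes are credited to Candidate C, and vice versa,
# even though every printed name on the paper ballots is correct.
```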

Errors in optical scans

Scanners have a row of photo-sensors which the paper passes by, and they record light and dark pixels from the ballot. A black streak results when a scratch or paper dust causes a sensor to record black continuously. A white streak can result when a sensor fails.
In the right place, such lines can indicate a vote for every candidate or no votes for anyone. Some offices blow compressed air over the scanners after every 200 ballots to remove dust.
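As a rough illustration, the sketch below (hypothetical image data and threshold) shows one way a stuck or dirty sensor could be flagged: a pixel column that is dark on nearly every scanned row is more likely a streak than a set of genuine voter marks.

```python
# Minimal sketch (hypothetical threshold) of detecting a stuck-sensor streak.
def find_streak_columns(image, dark_threshold=0.95):
    """image: 2D list of pixels, 0 = white, 1 = black.
    Returns columns that are dark in at least `dark_threshold` of the rows."""
    rows = len(image)
    cols = len(image[0]) if rows else 0
    streaks = []
    for c in range(cols):
        dark_rows = sum(image[r][c] for r in range(rows))
        if dark_rows / rows >= dark_threshold:
            streaks.append(c)
    return streaks

# A tiny 4x6 "ballot image" where column 2 is black on every row.
ballot = [
    [0, 0, 1, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 1, 0, 1, 0],
]
print(find_streak_columns(ballot))  # [2]
```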
Software can miscount; if it miscounts drastically enough, people notice and check. Staff can rarely say who caused an error, so they do not know whether it was accidental or a hack.
Recreated ballots are paper or electronic ballots created by election staff when the originals cannot be counted for some reason. Reasons include tears, water damage, folds that prevent feeding through scanners, and voters selecting candidates by circling them or making other abnormal marks. Reasons also include citizens abroad who use the Federal Write-In Absentee Ballot because they did not receive their regular ballot in time. As many as 8% of ballots in an election may be recreated.
When auditing an election, audits are done with the original ballots, not the recreated ones, to catch mistakes in recreating them.

Cost of scanning systems

If most voters mark their own paper ballots and one marking device is available at each polling place for voters with disabilities, Georgia's total cost of machines and maintenance for 10 years, starting in 2020, has been estimated at $12 per voter. Pre-printed ballots for voters to mark would cost $4 to $20 per voter. The low estimate includes $0.40 to print each ballot and more than enough ballots for historic turnout levels; the high estimate includes $0.55 to print each ballot and enough ballots for every registered voter, including three ballots per registered voter in primary elections with historically low turnout. The estimate is $29 per voter if all voters use ballot marking devices, including $0.10 per ballot for paper.
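The pre-printed-ballot figures combine a per-ballot printing cost with assumptions about how many ballots are printed per registered voter. The sketch below illustrates that arithmetic; the election counts and ballots-per-voter ratios are illustrative assumptions, not figures from the Georgia estimate.

```python
# Rough sketch of how per-voter printing cost scales with the assumptions.
def printing_cost_per_voter(cost_per_ballot, ballots_per_voter_per_election,
                            elections_in_period):
    """Printing cost per registered voter over the whole period."""
    return cost_per_ballot * ballots_per_voter_per_election * elections_in_period

# Low-end style assumption: $0.40 per ballot, ballots printed only for
# historic turnout (say 0.7 per registered voter), 15 elections in 10 years.
print(printing_cost_per_voter(0.40, 0.7, 15))   # about $4.20 per voter

# High-end style assumption: $0.55 per ballot, three ballots per registered
# voter in primaries and one otherwise (say 2.0 on average), 15 elections.
print(printing_cost_per_voter(0.55, 2.0, 15))   # about $16.50 per voter
```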
The capital cost of machines in 2019 in Pennsylvania is $11 per voter if most voters mark their own paper ballots and a marking device is available at each polling place for voters with disabilities, compared to $23 per voter if all voters use ballot marking devices. This cost does not include printing ballots.
New York has an undated comparison of capital costs, in which a system where all voters use ballot marking devices costs over twice as much as a system where most voters mark paper ballots by hand. The authors say extra machine maintenance would exacerbate that difference, and that printing costs would be comparable in both approaches. Their assumption of equal printing costs differs from the Georgia estimates of $0.40 or $0.55 to print a ballot in advance and $0.10 to print it in a ballot marking device.

Direct-recording electronic counting

A touch screen displays choices to the voter, who selects among them and can change her mind as often as needed before casting the vote. Staff initialize each voter once on the machine, to avoid repeat voting. Voting data and ballot images are recorded in memory components and can be copied out at the end of the election.
The system may also provide a means of communicating with a central location to report results and receive updates, which is also an access point through which hacks and bugs can arrive.
Some of these machines also print the names of chosen candidates on a paper tape for the voter to verify. These printed names can be used for election audits and recounts if needed. The tally of the voting data is stored in a removable memory component and in bar codes on the paper tape, which is called a voter-verified paper audit trail (VVPAT). VVPATs can be counted at 20–43 seconds of staff time per vote.
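At that rate, for example, hand-counting 10,000 votes from the paper tape would take roughly 56 to 119 hours of staff time.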
For machines without VVPAT, there is no record of individual votes to check.

Errors in direct-recording electronic voting

This approach can have software errors. It does not include scanners, so there are no scanner errors. When there is no paper record, it is hard to notice or research most errors.
Email, fax, phone apps, and web portals transmit information through the internet, between computers at both ends, so they are subject to errors and hacks at the origin, the destination, and in between. Security experts have found security problems in every attempt at online voting, including systems in Australia, Estonia, Switzerland, Russia, and the United States.
The Uniformed and Overseas Citizens Absentee Voting Act lets overseas citizens and all military and merchant marine voters receive ballots electronically. In 18 states they must then return the ballots by mail; elsewhere they can submit them by email, fax, or a secure web site, including from space.
Researchers have found security flaws in online voting systems from Voatz and Democracy Live.
In 2010, graduate students from the University of Michigan hacked into the District of Columbia's online voting system during a public mock election and changed all the cast ballots to favor their preferred candidates. The system, which would have allowed military voters and overseas citizens to vote on the web, was scheduled to be used later that year. It took the team of computer scientists only thirty-six hours to find the list of the government's passwords and break into the system.
In March 2000 the Arizona Democratic presidential primary was conducted partly over the internet by the private company votation.com. Each registered member of the party received a personal identification number in the mail and could vote in person or over the internet, using the PIN and answering two questions such as date and place of birth. During the election older browsers failed, but no hacks were identified.

Electronic processing of postal and absentee ballots

Checking signatures on envelopes of absentee ballots is hard, and is often computerized in jurisdictions with many absentee ballots. The envelope is scanned, and the voter's signature on the outside of the envelope is instantly compared with one or more signatures on file. The machine sets aside non-matches in a separate bin. Temporary staff then double-check the rejections, and in some places check the accepted envelopes too.
Error rates of computerized signature reviews are not published. "A wide range of algorithms and standards, each particular to that machine's manufacturer, are used to verify signatures. In addition, counties have discretion in managing the settings and implementing manufacturers' guidelines… there are no statewide standards for automatic signature verification… most counties do not have a publicly available, written explanation of the signature verification criteria and processes they use".
Handwriting experts agree that "it is extremely difficult for anyone to be able to figure out if a signature or other very limited writing sample has been forged."
The National Vote at Home Institute reports that 17 states do not mandate a signature verification process.
The Election Assistance Commission says that machines should be set to accept only nearly perfect signature matches and that humans should double-check a sample, but the EAC does not discuss acceptable error rates or sample sizes.
In the November 2016 general election, rejections ranged from none in Alabama and Puerto Rico, to 6% of ballots returned in Arkansas, Georgia, Kentucky and New York.
Where reasons for rejection were known, in 2018, 114,000 ballots arrived late, 67,000 failed signature verification, 55,000 lacked voter signatures, and 11,000 lacked witness signatures in states which require them.
The highest error rates in signature verification are found among lay people, who make more errors than computers, which in turn make more errors than experts.
Researchers have published error rates for computerized signature verification, comparing different systems on a common database of true and false signatures. The best system falsely rejects 10% of true signatures while accepting 10% of forgeries; the second-best has error rates of 14% on both measures, and the third-best 17%.
It is possible to be less stringent and reject fewer true signatures, at the cost of erroneously accepting more forgeries.
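This trade-off can be illustrated with a small sketch. The similarity scores below are hypothetical, and real systems use proprietary matching algorithms, but the effect of moving the acceptance threshold is the same: a stricter threshold rejects more forgeries and also more genuine signatures.

```python
# Similarity scores (0-1) a hypothetical matcher assigns to envelope signatures.
genuine_scores = [0.92, 0.85, 0.78, 0.66, 0.55, 0.95, 0.81, 0.73, 0.60, 0.88]
forgery_scores = [0.40, 0.52, 0.61, 0.35, 0.70, 0.45, 0.58, 0.30, 0.66, 0.49]

def error_rates(threshold):
    """Return (false_reject_rate, false_accept_rate) at a given threshold."""
    false_rejects = sum(s < threshold for s in genuine_scores)
    false_accepts = sum(s >= threshold for s in forgery_scores)
    return false_rejects / len(genuine_scores), false_accepts / len(forgery_scores)

for t in (0.55, 0.65, 0.75):
    frr, far = error_rates(t)
    print(f"threshold {t:.2f}: reject {frr:.0%} of genuine, accept {far:.0%} of forgeries")
# threshold 0.55: reject 0% of genuine, accept 40% of forgeries
# threshold 0.65: reject 20% of genuine, accept 20% of forgeries
# threshold 0.75: reject 40% of genuine, accept 0% of forgeries
```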
Vendors of automated signature verification claim high accuracy but do not publish their error rates.
Voters with short names are at a disadvantage, since even experts make more mistakes on signatures with fewer "turning points and intersections."

State and local websites for election results

Election offices display election results on the web by transferring USB drives between offline election computers and online computers that display results to the public. The USB drives can carry infections from the online computers to the election computers. Local governments communicate electronically with their state governments so the state can display results, with the result that problems at the state level can affect many or all local offices.
Election-reporting websites run software to aggregate and display results. These have had programming errors which showed erroneous partial results during the evening, and the wrong winner.
Before the 2016 general election, Russians gained access to at least one employee's account at a vendor which manages election-reporting websites. During the 2018 general election, a hacker in India gained administrative access to the Alaska election-reporting website.
Studies by McAfee and ProPublica in 2020 found that most election websites have inadequate security. McAfee analyzed swing states and ProPublica analyzed Super Tuesday states. They found many offices using outdated, insecure, dangerous, and inappropriate software, including unsupported operating systems, and using the same few web hosts, which they said is dangerous for critical infrastructure, since finding a flaw in one can lead to access to them all. They criticized offices for not using HTTPS encryption and for using public site names ending in .com or .org, since this leads voters to trust sites that are not .gov, and voters can easily be tricked by a similar name.

Election security

Decentralized system

In 2016 Homeland Security and the Director of National Intelligence said that United States elections are hard to hack, because they are decentralized, with many types of machines and thousands of separate election offices operating under 51 sets of state laws.
In 2018 a McAfee expert agreed that decentralization makes hacking hard, but added that it also makes defense hard, and that a nation state could hack multiple places. In any case each city or county election is run by a single office, and a few large offices affect state elections. County staff cannot in practice defend against the Russian state.

Security reviews

The Brennan Center summarized almost 200 errors in election machines from 2002 to 2008, many of which happened repeatedly in different jurisdictions, since there was no clearinghouse through which jurisdictions could learn from each other.
More errors have happened since then.
Machines in use are not examined to determine if they have been hacked, so no hacks of machines in use have been documented. Researchers have hacked all machines they have tried, and have shown how the machines can be undetectably hacked by manufacturers, election office staff, poll workers, voters, and outsiders; the public can access unattended machines in polling places the night before elections. Some of the hacks can spread among machines on the removable memory cards that tell the machines which races to display and carry results back to the central tally location.
In December 2007 the Ohio Secretary of State, Jennifer Brunner, released the results of a comprehensive review of Ohio's electronic voting technology. The study examined electronic voting systems – both touch-screen and optical scan – from Election Systems & Software, Hart InterCivic, and Premier Election Solutions.
Three teams of security researchers, based at the Pennsylvania State University, the University of Pennsylvania, and WebWise Security, Inc., conducted the security reviews. The teams had access to voting machines and software source code from the three vendors and performed source code analysis and security penetration testing with the aim of identifying security problems that might affect the integrity of elections that use the equipment.
Besides specific problems in each system, the Ohio report noted that all of the reviewed systems had serious security vulnerabilities.
In August 2007, California Secretary of State Debra Bowen announced the results of a "top-to-bottom review" of the security of all electronic voting systems in the state, including those from Diebold Election Systems, Hart InterCivic, Sequoia Voting Systems, and Election Systems & Software. An August 2 report by computer security experts from the University of California found flaws in voting system source code. On July 27 "red teams" reported on "worst case" Election Day scenarios, in which they identified vulnerabilities to tampering or error. The Top-to-Bottom Review also included a comprehensive review of manufacturer documentation as well as a review of accessibility features and alternative language requirements.
The California security experts found significant security flaws in all of the manufacturers' voting systems, flaws that could allow a single non-expert to compromise an entire election.
The July and August reports found that three of the tested systems fell far short of the minimum requirements specified in the EAC 2005 Voluntary Voting System Guidelines.
On August 3, 2007, Bowen decertified machines that were tested, and also the ES&S InkaVote machine, which was not included in the review because the company submitted it past the deadline for testing. Some of the systems tested were conditionally recertified with new stringent security requirements imposed. The companies in question had until the February 2008 California Presidential Primaries to fix their security issues and ensure that election results could be closely audited.
California has continued to report on security of newer election machines.
In 2006, Princeton University computer scientists studied security of the Premier Election Solutions AccuVote-TSx voting system for a group of New Jersey counties. Their results showed that the AccuVote-TSx was insecure and could be "installed with vote-stealing software in under a minute." The scientists also said that machines can transmit computer viruses from one to another "during normal pre- and post-election activity."

Audits

Five states (AK, CA, PA, UT, WV) check all contests by hand tallies in a small percentage of locations, though California excludes about half the ballots (those counted after election day) and Alaska excludes small precincts.
Two states (NY, VT) check all contests, in a small percentage of locations, with machines independent of the election machines.
Seventeen states check one or a few contests by hand, usually federal races and the governor; most local contests are not checked.
Four states (CT, IL, MD, NV) reuse the same machines or ballot images as the election, so errors can persist.
Sixteen states do not require audits, or require them only in special circumstances.
In seven states (IN, KY, LA, MS, NJ, TN, TX) many voters still lack paper ballots, so audits are not possible.

Election companies

Three vendors sell most of the machines used for voting and for counting votes. As of September 2016, the American company Election Systems & Software served 80 million registered voters, Canada's Dominion Voting Systems 70 million, the American company Hart InterCivic 20 million, and smaller companies fewer than 4 million each.
More companies sell signature verification machines: ES&S, Olympus, Vantage, Pitney Bowes, Runbeck, and Bell & Howell.
Amazon provides election websites in 40 states, including election-reporting sites in some of them. A Spanish company, Scytl, manages election-reporting websites statewide in 12 U.S. states, and in another 980 local jurisdictions in 28 states.
Another website management company is VR Systems, active in 8 states.
Maryland's election website is managed by a company owned by an associate of Russian President Putin.

Timeline of development

In the summer of 2004, the Legislative Affairs Committee of the Association of Information Technology Professionals issued a nine-point proposal for national standards for electronic voting. In an accompanying article, the committee's chair, Charles Oriez, described some of the problems that had arisen around the country.
Legislation has been introduced in the United States Congress regarding electronic voting, including the Nelson-Whitehouse bill. This bill would appropriate as much as $1 billion to fund states' replacement of touch screen systems with optical scan voting systems. The legislation would also require audits of 3% of precincts in all federal elections, and would mandate some form of paper audit trail by 2012 for all electronic voting machines, on any type of voting technology.
Another bill, H.R. 811, proposed by Representative Rush D. Holt, Jr., a Democrat from New Jersey, would amend the Help America Vote Act of 2002 to require electronic voting machines to produce a paper audit trail for every vote. The Senate companion bill, introduced by Senator Bill Nelson of Florida on November 1, 2007, would require the Director of the National Institute of Standards and Technology to continue researching and providing methods of paper ballot voting for voters with disabilities, voters who do not primarily speak English, and voters with low literacy. It would also require states to provide the federal office with audit reports from the hand counting of the voter-verified paper ballots. The bill has been referred to the United States Senate Committee on Rules and Administration, and a vote date has not been set.
During 2008, Congressman Holt, citing increasing concern about the insecurity of electronic voting technology, submitted additional bills to Congress regarding the future of electronic voting. One, called the "Emergency Assistance for Secure Elections Act of 2008", states that the General Services Administration would reimburse states for the extra costs of providing paper ballots and of hiring people to count them. Introduced in the House on January 17, 2008, the bill estimates that $500 million would cover the costs of the conversion back to paper ballots, $100 million would pay voting auditors, and $30 million would pay hand counters. The bill would give the public the choice to vote manually if they do not trust the electronic voting machines. A voting date has not yet been determined.
Among the relevant legislation introduced in the 115th Congress was a bill whose provisions include designation of the infrastructure used to administer elections as critical infrastructure; funding for states to upgrade the security of the information technology and cybersecurity elements of election-related IT systems; and requirements for durable, readable paper ballots and manual audits of election results.