Years-Long Court Battle in Georgia Reveals Dominion’s Security Flaws, Weak Testing

A Fulton County employee moves voting machine transporters to be stored at the Fulton County Election Preparation Center in Atlanta, Ga., on Nov. 4, 2020. (Jessica McGowan/Getty Images)
Jeff Carlson
12/9/2020 | Updated: 12/9/2020
News Analysis
Behind the current controversy surrounding the integrity of results from the Nov. 3 presidential election in Georgia are years of court battles over an outdated voting system and the controversial $107 million purchase of new touchscreen machines from Dominion Voting Systems in July 2019.
A review of court documents and sworn expert testimonies raises troubling questions about the Dominion voting system and its rushed implementation by the State of Georgia.

Among the many issues raised was the inability to accurately audit Dominion’s systems in order to verify that votes were cast as intended. Audit and cybersecurity experts also demonstrated to the court how the Dominion system inherently prevented the successful use of risk-limiting audits (RLA)—a method employed by Georgia during the recount.

Cybersecurity experts provided evidence to the court that Dominion’s QR system wasn’t secure, that it was subject to duplication, and that fake QR codes could be generated. A nationally recognized cybersecurity expert also found that during Georgia’s August 2020 elections, servers at two county election offices he visited “enabled unsafe remote access to the system through a variety of means,” including the use of flash drives.

This same expert found that in one of those counties, “server logs were not regularly recording or updated in full and that Dominion’s technical staff maintained control over the logs and made deletions in portions of the logs.”

Computer science experts also found significant problems with the testing processes used by Pro V&V in the testing of the Dominion equipment. In a case that involved last-minute updates to Dominion software in late September 2020, the court was told that the testing lab “performed only cursory testing of this new software.”

Additionally, during testing in 2019, a Dominion system experienced what was termed a “memory lockup” after scanning only 4,500 ballots. An analysis from Dominion determined that a “power cycle” of the unit is required after scanning more than 4,000 ballots. It isn’t known if this issue was fixed prior to the 2020 elections or if election workers were properly trained in the event the issue was still present in the Dominion systems.

The court also found that the manner in which the Dominion system functions failed to meet the requirements of Georgia election law. As U.S. District Judge Amy Totenberg noted, Dominion’s system “does not produce a voter-verifiable paper ballot or a paper ballot marked with the voter’s choices in a format readable by the voter because the votes are tabulated solely from the unreadable QR code.”

In response to the problems presented to the court, Totenberg issued a ruling, noting “demonstrable evidence” that the implementation of Dominion’s systems by the State of Georgia places voters at an “imminent risk of deprivation of their fundamental right to cast an effective vote,” which the judge defined simply as a “vote that is accurately counted.”
However, Totenberg ruled that “despite the profound issues raised … the Court cannot jump off the legal edge and potentially trigger major disruption in the legally established state primary process.”

Dominion Systems Don’t Conform to State Law

While acknowledging that Georgia’s Election Code mandates the use of a ballot-marking device (BMD) system as the method of voting in Georgia, Totenberg also noted there are certain legal requirements that must be concurrently met in the use of such a system:
“The statutory provisions mandate voting on “electronic ballot markers” that: (1) use “electronic technology to independently and privately mark a paper ballot at the direction of an elector, interpret ballot selections, communicate such interpretation for elector verification, and print an elector verifiable paper ballot;” and (2) “produce paper ballots which are marked with the elector’s choices in a format readable by the elector.””
And as noted by the judge, the Dominion systems and equipment purchased by the State of Georgia failed to conform to the state’s own legal requirements:
“The evidence shows that the Dominion BMD system does not produce a voter-verifiable paper ballot or a paper ballot marked with the voter’s choices in a format readable by the voter because the votes are tabulated solely from the unreadable QR code. Thus, under Georgia’s mandatory voting system for “voting at the polls” voters must cast a BMD-generated ballot tabulated using a computer generated barcode that has the potential to contain information regarding their voter choices that does not match what they enter on the BMD (as reflected in the written text summary), or could cause a precinct scanner to improperly tabulate their votes.”
In other words, the equipment, as provided by Dominion and put in place by the state, failed to meet the legal requirements that Georgia has in place for a voting system.

Risk-Limiting Audits Deemed Unreliable

Totenberg also addressed the use of risk-limiting audits (RLA), a statistical methodology used to audit election outcomes before they become official that has been endorsed by the National Academies of Sciences, Engineering, and Medicine.

As the judge noted, the consensus among experts is that “the best audit trail is voter-marked paper ballots.” By contrast, “voter-verifiable paper records printed by voting machines are not as good.”

Georgia’s use of the new Dominion machines created a particular problem regarding the performance of a successful RLA, precisely because the system “by its nature, erases all direct evidence of voter intent.” As Totenberg stated, “There is no way to tell from a BMD printout what the voter actually saw on the screen, what the voter did with the device, or what the voter heard through the audio interface.”

This creates a situation in which auditors are severely limited and “can only determine whether the BMD printout was tabulated accurately, not whether the election outcome is correct.” Totenberg stated in her ruling that a BMD printout “is not trustworthy” and the application of an RLA to an election that used BMD printouts “does not yield a true risk-limiting audit.”

Election security expert J. Alex Halderman noted the same issues in a sworn declaration, telling the court, “if voters do not reliably detect when their paper ballots are wrong, no amount of post-election auditing can detect or correct the problem.”
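For context, a risk-limiting audit works by hand-examining a randomly drawn sample of paper records and testing them against the reported outcome, expanding the sample until a chosen “risk limit” is satisfied or a full hand count is triggered. The sketch below is a simplified ballot-polling variant for a two-candidate contest; it is an illustration of the general mechanics under stated assumptions, not Georgia’s actual procedure or VotingWorks’ implementation.

```python
import random

def ballot_polling_rla(paper_records, reported_winner_share, risk_limit=0.1, max_draws=5000):
    """Simplified BRAVO-style ballot-polling audit for a two-candidate contest.

    paper_records: list of 'W' or 'L', the paper record's vote for the reported
    winner or loser. The audit implicitly trusts these records to reflect voter intent.
    Returns True if the reported outcome is confirmed at the risk limit, or False if
    the audit gives up without confirming (signaling escalation to a full hand count).
    """
    assert reported_winner_share > 0.5
    threshold = 1.0 / risk_limit                 # e.g., 10 for a 10 percent risk limit
    test_stat = 1.0
    for _ in range(max_draws):
        record = random.choice(paper_records)    # draw a ballot at random
        if record == 'W':
            test_stat *= reported_winner_share / 0.5        # evidence for the reported winner
        else:
            test_stat *= (1 - reported_winner_share) / 0.5  # evidence against it
        if test_stat >= threshold:
            return True                          # outcome confirmed at the stated risk limit
    return False                                 # escalate to a full hand count

# Example: a contest reported as 55 percent to 45 percent.
records = ['W'] * 5500 + ['L'] * 4500
print(ballot_polling_rla(records, reported_winner_share=0.55))
```

Every step above examines only the paper record. If BMD printouts systematically misstate what voters saw on screen, the procedure can still “confirm” the wrong outcome, which is the core of Stark’s and Totenberg’s objection.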

VotingWorks Employed Risk-Limiting Audit in Georgia

During the court proceedings, two contrasting views were presented. Ben Adida, founder and executive director of VotingWorks, claimed that as long as “voters verify the text, and as long as RLAs are conducted on the basis of the same ballot text, then potential QR code mismatches are caught just like any other tabulation mistake might be caught.”

But Adida’s position was heavily criticized by Philip Stark, a “preeminent renowned statistician,” who is the “original inventor and author of the risk-limiting audit (“RLA”) statistical methodology.”

Stark noted that Adida’s premise relies on the assumption that voters will actually review and verify their ballot selections on their ballot printout. But “overwhelming evidence from actual studies” of voter behavior “suggests that less than ten percent of voters check their printouts and that voters who do check often overlook errors.”

Therefore, following an actual election—such as the Nov. 3 presidential election—there is simply no way to ascertain how many voters actually checked their BMD printouts for accuracy, inherently impairing, and perhaps destroying, the value of a post-election audit.
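A rough back-of-the-envelope calculation, using purely hypothetical numbers for every rate involved, illustrates why this matters: if only a small fraction of voters check their printouts, even a misprint rate large enough to swing a close race would surface as only a handful of complaints.

```python
# Hypothetical illustration of Stark's point; every number below is an assumption, not a finding.
ballots_cast  = 100_000
misprint_rate = 0.01    # suppose a compromised BMD altered 1 percent of printouts
check_rate    = 0.10    # "less than ten percent of voters check their printouts"
catch_rate    = 0.50    # suppose only half of the voters who check notice the error
report_rate   = 0.50    # suppose only half of those who notice report it to a poll worker

altered  = ballots_cast * misprint_rate
reported = altered * check_rate * catch_rate * report_rate
print(f"{altered:.0f} altered printouts, roughly {reported:.0f} ever reported")
# -> 1000 altered printouts, roughly 25 ever reported
```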

Additionally, Stark “categorically” disagreed with Adida’s position that a post-election audit can establish that the voting systems actually functioned correctly during the elections. As Stark told the court, “audits of BMD-marked ballot printouts cannot reliably detect whether malfunctioning BMDs printed the wrong votes or omitted votes or printed extra votes.”

Notably, Stark testified that “this is true, even if the malfunctions were severe enough to make losing candidates appear to win.”

Despite the significant issues noted by Stark, the State of Georgia had already “contracted with Adida’s VotingWorks for guidance in the development and implementation of a RLA.”

Indeed, VotingWorks was used by Georgia to perform its risk-limiting audit of the Nov. 3 presidential election:
“Georgia’s first statewide audit successfully confirmed the winner of the chosen contest and should give voters increased confidence in the results,” said Ben Adida, Executive Director of VotingWorks. “We were proud to work with Georgia on this historic audit. The difference between the reported results and the full manual tally is well within the expected error rate of hand-counting ballots, and the audit was a success.”
The material flaws inherent in employing an RLA, particularly given Georgia’s statewide implementation of Dominion’s voting systems, didn’t keep Secretary of State Brad Raffensperger from announcing the results of the audit as certain:
“Secretary of State Brad Raffensperger announced the results of the Risk Limiting Audit of Georgia’s presidential contest, which upheld and reaffirmed the original outcome produced by the machine tally of votes cast. Due to the tight margin of the race and the principles of risk-limiting audits, this audit was a full manual tally of all votes cast. The audit confirmed that the original machine count accurately portrayed the winner of the election.”
Raffensperger’s statement is in direct conflict with that of Totenberg, who noted that “there is no audit remedy that can confirm the reliability and accuracy of the BMD system, as Dr. Stark has stressed.”

Encryption Claim Disputed by Court

Judge Totenberg noted that Georgia had presented the cybersecurity of Dominion’s systems as “reliable and fortified,” based on the testimonies of Eric Coomer, Dominion’s director of product strategy and security, and Jack Cobb, the laboratory director for Pro V&V.

Notably, Secretary of State Raffensperger had retained Pro V&V to perform a review of the Dominion system purchased by Georgia. One of the representations made by Cobb was that the system’s security was “fortified by the encryption of the QR code [a scan code produced by the ballot machine after voting] and accompanying digital signature code as well as various other security measures.”

Despite Cobb’s claims of “fourteen years of experience in testing voting machines,” the court found that Cobb lacked “any specialized expertise in cybersecurity testing or analysis or cybersecurity risk analysis.” Additionally, the court found that Cobb “had not personally done any of the security testing referenced in his affidavits.”

Cobb testified that the QR codes on Dominion’s printed ballots are encrypted, but that assertion was immediately disputed by the plaintiffs’ experts. Under questioning, the court found that Cobb was relying on Dominion’s documentation for his claim that the QR codes were encrypted but hadn’t actually tested the claim.

The court found that the “evidence plainly contradicts any contention that the QR codes or digital signatures are encrypted,” and pointed out that this was “ultimately conceded by Mr. Cobb and expressly acknowledged later by Dr. Coomer during his testimony.”

The court also heard from Vincent Lui, a leading international cybersecurity analyst who is CEO of cybersecurity firm Bishop Fox. He started with the National Security Agency as a global network exploitation analyst, and led the global penetration team for Honeywell International.

During testimony, Lui “addressed head-on the inaccuracy of any contention that the QR code or signature utilized in the Dominion BMD system in Georgia is encrypted.” Lui noted that while his firm wasn’t granted physical access to the Dominion machines in Georgia, it was able to develop code that read the Dominion QR code. From there, they extracted the raw data and determined that the Dominion QR code wasn’t encrypted and that fake QR codes could be generated:
“In this case, public-key cryptography was not being used with QR codes. And so the implication is that with the BMDs and the generation of the QR codes  themselves -- the implication with the design of the Dominion BMD system is that any device that has necessary keys to operate would be able to generate a fake QR code. And you would not be able to determine which machine generated it, whether it was the EMS, the BMD, the ICP, or any other system that had that key loaded on to it.”
This conclusion was echoed by Halderman (also an expert witness in the court case) during an Oct. 26 interview with PBS Newshour. Halderman stated that by “analyzing the structure of the QR codes, I have been able to learn that there’s nothing that stops an attacker from just duplicating one, and the duplicate would count the same as the original bar code.”
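To illustrate the distinction the experts are drawing, the sketch below is generic: Dominion’s actual QR payload format is not public and is not reproduced here. The point is that encoding data in a QR code is not encryption, and without a key-based signature a scanner has no cryptographic way to tell an original code from a duplicate or an altered one.

```python
import hmac, hashlib

# Hypothetical vote payload for illustration only; the real QR layout is not reproduced here.
genuine = b"contest=PRES;choice=CANDIDATE_A"
altered = b"contest=PRES;choice=CANDIDATE_B"

# Without encryption or a keyed signature, a QR code is just encoded text: anyone who can
# read one code can duplicate it byte for byte, or mint an altered one, and the scanner
# has no cryptographic basis for telling the copies apart from a BMD's genuine output.
duplicate = bytes(genuine)    # scans identically to the original

# A keyed signature (HMAC below; a real design might use public-key signatures) lets the
# scanner reject payloads produced without the key, though, as Lui testified, any device
# holding the key could still generate a "valid" code indistinguishable from a BMD's.
key = b"hypothetical-election-key"

def sign(data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(data), tag)

tag = sign(genuine)
print(verify(genuine, tag))    # True:  accepted
print(verify(altered, tag))    # False: an altered payload is rejected without the key
```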

Lui noted that if you have “an infected BMD that has been compromised [by malware], it can just tell you whatever value that it wants. [A]s it is deployed within the Dominion devices, it does not appear to be used in a fashion that could be considered secure. It can easily be circumvented.”

Lui’s concerns with the security of the Dominion system didn’t end there. He informed the court that the underlying Android operating system was “over half a decade out of date,” and also noted that the Dominion system used “USB devices and portals,” which Lui considered to be “fraught with security concerns.”

Lui concluded that “the design of the security of the BMD [Dominion] system is not secure and requires a more in-depth review.”

Testimony From Hursti

Harri Hursti was described by the court as “a nationally recognized cybersecurity expert who has worked in security-oriented IT technology for over 30 years, with a particular expertise in the knowledge, observation, and prevention of malicious activities in networked environments.” He also organized the 2018 Voting Machine Hacking Village, “for which he was awarded a Cyber Security Excellence Award.”

Hursti told the court that to “my knowledge, no jurisdiction has permitted, and Dominion has not permitted, independent research, academic or otherwise, to be conducted on its systems, which greatly limits the number of people with any experience with the Dominion systems.”

But Hursti was present as an observer during the June 9 statewide primary election in Georgia and the runoff elections on Aug. 11.

According to the court, Hursti found that Georgia’s Dominion “election servers enabled unsafe remote access to the system through a variety of means, extending from frequent use of flash drives and accessing of the internet to the use of outside unauthorized applications (such as game programs) residing on election management and tabulation servers and other practices.”

Hursti told the court that “these practices drive a hole through the essential cybersecurity foundation requirement of maintaining a ‘hardened’ server” and that, without these basic protections, “malware can far more easily penetrate the server and the operative BMD system software.”

Perhaps more troubling, Hursti found that in one Georgia county, “server logs were not regularly recording or updated in full and that Dominion’s technical staff maintained control over the logs and made deletions in portions of the logs.”

As he testified, secure and complete logs “are essential as the most basic feature of system security as they provide the detailed activity trail necessary for the identification of security threats and server activity and are required for purposes of conducting a sound audit.”

Hursti succinctly presented his opinion that, given the irregularities he observed as a cybersecurity expert, he had serious doubt that the system was operating correctly, and that “when you don’t have an end-to-end chain of the voter’s intent,” and when a system could be maliciously or unintentionally compromised, there is no capability of auditing the system results.

Troubling Testing Results From Pro V&V

On Aug. 7, 2019, Wendy Owens of Pro V&V submitted a Test Report related to the Dominion System 5.5-A for Georgia. The certification testing from Pro V&V was later described by Halderman in court documents as being “limited to checking functional compliance with Georgia requirements.”
Indeed, Pro V&V’s own report notes that the “state certification test was not intended to result in exhaustive tests of system hardware and software attributes.”

The report, although generally sparse, did contain one particular item of material significance. During testing, the Dominion system experienced what was termed a “memory lockup” after scanning only 4,500 ballots.

The issue was noted and presented to Dominion for resolution. The analysis from Dominion, although brief, raises material security concerns about on-the-ground system operation.

Dominion noted in a statement included in the test report that the operating system lacked a memory management system and was therefore subject to “memory fragmentation” when the ImageCast Precinct Optical Scanner (ICP) processes more than 4,000 ballots:
“The ICP uClinux operating system does not have a memory management unit (MMU) and, as such, it can be susceptible to memory fragmentation. The memory allocation services within the ICP application are designed to minimize the effects of memory fragmentation. However, if the ICP scans a large number of ballots (over 4000), without any power cycle, it can experience a situation where the allocation of a large amount of memory can fail at the Operating System level due to memory fragmentation across the RAM. This situation produces an error message on the ICP which requires the Poll Worker to power cycle the unit, as documented. Once restarted, the ICP can continue processing ballots without issue. All ballots scanned and counted prior to the power cycle are still retained by the unit; there is no loss in data.”
According to the report from Owens, “Pro V&V performed a power cycle, as instructed by Dominion, and verified that the issue was resolved and that the total ballot count was correct. Scanning then resumed with no additional issues noted.”
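For readers unfamiliar with the term, memory fragmentation means free memory becomes split into many small, non-contiguous pieces, so a large allocation can fail even though plenty of memory is free in total. The toy first-fit allocator below is a simplified illustration of the effect, not Dominion’s uClinux code; it also shows why a power cycle, which clears the heap entirely, “resolves” the symptom only until the heap fragments again.

```python
# Toy first-fit allocator with no MMU: an allocation needs one contiguous free gap.
HEAP = 1000                # pretend heap of 1,000 units
blocks = []                # (start, size) of live allocations

def alloc(size):
    cursor = 0
    for start, sz in sorted(blocks):
        if start - cursor >= size:
            blocks.append((cursor, size)); return cursor
        cursor = start + sz
    if HEAP - cursor >= size:
        blocks.append((cursor, size)); return cursor
    return None            # no single gap is large enough

def free(addr):
    blocks[:] = [b for b in blocks if b[0] != addr]

# A long run of small allocations fills the heap...
addrs = [alloc(10) for _ in range(100)]
# ...then every other one is released, leaving 500 free units in fifty separate 10-unit holes.
for a in addrs[::2]:
    free(a)

# Half the heap is free, but no contiguous run of 50 units exists, so the request fails.
print(alloc(50))           # None; on the ICP this is the point where a power cycle is required
```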

Despite the potential for significant operational problems during an election, the Pro V&V report stated that Dominion’s system “successfully passed the Accuracy Test.” Owens, who performed the testing, raised the fragmentation issue only as a side note, stating that Pro V&V “performed a power cycle, as instructed by Dominion,” and that she verified the issue was resolved.

The designation of “resolved” by Pro V&V is particularly troubling, given that Dominion’s analysis appears to indicate an ongoing occurrence of this issue:
“If the ICP scans a large number of ballots (over 4000), without any power cycle, it can experience a situation where the allocation of a large amount of memory can fail at the Operating System level due to memory fragmentation across the RAM.”
That the solution to this seemingly significant issue is for on-site poll workers to “power cycle the unit” whenever the ICP surpasses 4,000 ballots raises material concerns. The requirement to perform this operation at the level of on-site ballot counting seems cumbersome and prone to worker error and confusion.
It remains unknown whether this issue was rectified in Dominion’s Democracy Suite Version 5.5-C, which was used in the Nov. 3 presidential election. The issue wasn’t raised in later Pro V&V testing reports that were reviewed for this article.

Dominion’s Pre-Election ‘Software Patch’

During pre-election testing of Dominion’s voting systems in late September, Georgia officials discovered a problem relating to the displays for the U.S. Senate race, finding that under certain circumstances, not all of the candidates’ names would fit properly onto a single screen.

Dominion embarked on a software modification to address the problem, which required testing validation from Pro V&V, as the software had now been changed across the Dominion systems. That led to some disagreement as to the breadth of the software changes made and the possible need for resulting system re-certification due to those changes.

Eric Coomer, Dominion’s director of product strategy and security, told the court that it was his belief the software change “was de minimis.” He stated that Dominion didn’t make that determination itself but would instead “submit that change to an accredited laboratory,” in this case, Pro V&V.

The official designation of the software being deemed “de minimis” was important, as it would have bearing on the need for complete Election Assistance Commission (EAC) recertification of the Dominion systems—something that might require more time than was available ahead of the Nov. 3 presidential election.

If the software change was deemed de minimis, it’s then submitted to the EAC as an engineering change order or “ECO.” As Coomer testified, “So there is no new EAC certification effort. It is simply updating the current certification for this ECO.”

On Oct. 2, Pro V&V’s Owens sent a letter confirming “that this version of the ICX software corrected the issue with displaying of two-column contests.” The letter concluded with a recommendation from Pro V&V that the software change to Dominion’s systems be “deemed as de minimis.”

Halderman’s Criticisms of Pro V&V’s Work

The report from Pro V&V was heavily criticized by Halderman in an Oct. 3 sworn declaration submitted to the court. Importantly, this declaration not only criticized the most recent testing by Pro V&V but also cast further doubt on the breadth of the testing done by Pro V&V in general.

According to the court, Halderman is “a Professor of Computer Science and Engineering and Director of the University of Michigan Center for Computer Security and Society. He is a nationally recognized expert in cybersecurity and computer science in the elections field.”

Halderman told the court that following his review of the “Letter Report” from Pro V&V, the “report makes clear that Pro V&V performed only cursory testing of this new software. The company did not attempt to independently verify the cause of the ballot display problem, nor did it adequately verify that the changes are an effective solution.” Nor did Pro V&V test whether the changes made by Dominion to fix the display problem had any effect on the “reliability, accuracy, or security of the BMD system.”

Halderman told the court that the “superficial testing” was particularly concerning, as it became apparent that the source-code changes made by Dominion were “considerably more complicated than what Dr. Coomer previously testified was the threshold for considering a change to be ‘de minimis.’”

Halderman noted that one particular change involved “changing a ‘variable declaration’ to modify the ‘type’ of a variable,” while Hursti noted that this type of change has the potential to permeate the system and change how the program operates “everywhere the variable is used.”
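As a generic illustration of that point (hypothetical code, not Dominion’s source), a one-line change to a variable’s declared type can alter arithmetic results, and even raise runtime errors, at every place the variable flows through the program:

```python
# Hypothetical sketch: how a single "type" change in a declaration ripples outward.

ROW_HEIGHT = 120            # original declaration: an integer number of pixels
# ROW_HEIGHT = 120.0        # the kind of one-line type change at issue

def rows_per_screen(panel_height):
    return panel_height // ROW_HEIGHT     # 8 with the int declaration, 8.0 with the float

def visible_candidates(names, panel_height):
    n = panel_height // ROW_HEIGHT
    return names[:n]                      # slicing with a float index raises TypeError

names = [f"Candidate {i}" for i in range(12)]
print(rows_per_screen(1000))                   # 8
print(len(visible_candidates(names, 1000)))    # 8, but this call fails if ROW_HEIGHT is a float
```

Whether any given change of this kind is harmless can only be established by testing the places the variable reaches, which is why Halderman argued the modification called for more than a visual check of the display fix.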

He noted that it wasn’t possible to evaluate the effects of the software change by examining only the source-code changes. Instead, he testified, an expert in programming languages should have been engaged for the examination, but wasn’t.

Halderman told the court that “rigorous regression testing” was essential in order to ensure that the system’s “reliability, accuracy, and security are not degraded.” Instead, Pro V&V was solely focused on verifying the ballot display issue was rectified without testing whether the overall functionality of the system remained intact:
“They did not test that the other functionalities of the machine are not impacted by the change. They did not test that the BMD selected and printed results accurately, nor did they test that security was unaffected. Tests only answer the questions you ask. Here—regardless of what Pro V&V intended—the only questions asked were: “Is the stated error observed when using the old software?” and “Is the stated error observed when using the new software?” They did not ask, “Is Dominion correct about the cause of the problem?” They did not ask, “Does this change absolutely and completely fix the issue?” Most importantly, they never asked or answered the key question for determining whether the change is de minimis, “Will these modifications have any impact on the rest of the voting system’s functionality?”
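To make that distinction concrete, here is a hypothetical sketch with toy stand-in functions, not Dominion’s code or Pro V&V’s actual test plan: the first assertion answers only “is the stated error gone?”, while the later ones are the regression-style questions Halderman says were never asked.

```python
# Toy stand-ins for illustration only; they are not Dominion's display or tabulation code.

def render_pages(num_candidates, columns=2, rows_per_column=11):
    """Toy display logic: how many screens a contest needs."""
    per_screen = columns * rows_per_column
    return -(-num_candidates // per_screen)      # ceiling division

def tabulate(ballots):
    """Toy tabulation logic, which a display-only fix should leave untouched."""
    counts = {}
    for choice in ballots:
        counts[choice] = counts.get(choice, 0) + 1
    return counts

# Fix-verification: the only kind of question the Pro V&V letter report answered.
assert render_pages(21) == 1                     # is the stated error observed with the new software?

# Regression-style questions: does everything else still behave the same after the change?
ballots = ["Candidate A"] * 300 + ["Candidate B"] * 200
assert tabulate(ballots) == {"Candidate A": 300, "Candidate B": 200}
print("display fix verified and tabulation unchanged")
```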
Halderman also questioned the EAC’s pending acceptance of Pro V&V’s recommendation to deem the change “de minimis,” noting that this carveout for small software changes was relatively new and had only been used “on one or two occasions.”

Hursti closed by stating that he didn’t believe the EAC could determine that the new Dominion software met the de minimis criteria, and that he hoped the “agency demands more rigorous testing before allowing the software to be used under its certification guidelines.”

The EAC accepted Pro V&V’s recommendation without question and certified the Dominion software directly ahead of the Nov. 3 presidential election.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.