Nov 08 Election Audit Reports – Part 1 – Bad Cards, Procedural Lapses Continue

Introduction

This week the University of Connecticut (UConn) VoTeR Center released reports on post-election audits and memory card testing for the November 2008 election. These reports were announced by a press release from the Secretary of the State, Susan Bysiewicz. <Press Release> <Post-Election Memory Card Report> <Post-Election Audit Report>. Today we will highlight and comment on the Memory Card Report.  In Part 2 we will highlight and comment on the Post-Election Audit Report.

We should all applaud the unique memory card testing program, yet we must also act aggressively to close the gaps it continues to expose.

Summary

From the press release:

“My office entered into this historic partnership with the University of Connecticut VoTeR Center so that we could receive an independent, unbiased accounting of Connecticut’s optical scan voting machines,” said Bysiewicz. “The results of these two studies confirm that numbers tallied by the optical scanners were remarkably accurate on Election Day November 4, 2008. Voters should feel confident that their votes were secure and accurately counted.”

From the Post-Election Memory Card Audit Report:

In summary: (1) all cards used in the election were properly programmed, (2) cards with junk data continue to be a problem, and additional analysis is in progress to determine the cause, (3) a number of cards show that the pre-election procedures are not followed uniformly and that cards continue to be duplicated; we recommend a stronger policy statement on handling the cards before and during the election and disallowing memory card duplication.

The Secretary of the State, her Office, and UConn are rightfully proud of initiating the audit in 2008 and instituting the unique memory card testing program. We recognize and appreciate the hard work of everyone involved in these programs, performing the audits and creating these reports, including the Registrars, the Secretary of the State’s staff, and UConn. We also welcome Secretary Bysiewicz’s commitment to solving the problems identified:

From the Press Release:

“Overall, I’m pleased that our first pre- and post-testing procedures with UConn demonstrate the security of our office’s chain of custody practices with election officials,” said Bysiewicz. “However, the percentage of unreadable cards is still too high and we await UConn’s forthcoming investigation into possible causes and recommended solutions for guidance on this issue. In the interim we will provide additional training to local election officials to make sure regulations concerning the handling and security of memory cards used by the optical scanners are uniformly followed throughout the State of Connecticut.”

From the Secretary of the State’s May 14th Newsletter:

Moving forward, my office will continue to improve the training we give to Registrars of Voters and local election officials to reduce any further errors in counting. We will also start new training within weeks to improve the security of memory cards used by the optical scanners to record votes on Election Day.

Our comments and concerns:

  • This is not a random audit of memory cards. We continue to applaud this unique memory card testing program, yet the cards tested are a registrar-selected set, not an exhaustive collection and not a random sample: 297 cards used in the election were tested, out of 833 districts. This opens a huge hole for covering up errors and fraud, since officials who want a card to escape scrutiny can simply not send it in. It also biases any statistics one way or the other, based on which cards tend to be sent to UConn.
  • A 9% memory card failure rate is bad enough, but is it the actual rate? 9% of all cards sent to UConn had memory problems. These bad cards are all classified as cards not used in the election; measured against the not-used cards alone, the rate would be 41/142, or 29%. We wonder whether many bad cards are found in the process of pre-election testing and therefore not used in the election, and thus not counted in the audit. Or could some of these cards have worked in the election and failed subsequently? Bottom line: we don’t know the actual failure rate, since we don’t have a random sample. The 9% is within the range of previous UConn pre- and post-election tests. <all UConn Reports> <Our Past Commentary>
  • There is a serious failure of officials to follow procedures. A rate of 34%, or 144 failures to follow procedures on 421 non-junk-data cards, is a serious, pervasive problem (52 not set for election, 20 results print aborted, 2 set for election with zero counters, 41 duplication events, 29 zero totals printed before the date of the election). In some cases multiple problems may have occurred on the same card, so the number of districts detected as failing to follow procedures is likely somewhat less than 34%. (This paragraph has corrected numbers; in an earlier version we had double counted some of the errors.) Once again, this is not a random sample, yet it is a totally unacceptable level of failure to follow procedures. The report correctly suggests these should be addressed through better training and instruction to municipalities. If procedures are necessary, then when they are not followed there is an opportunity for problems to occur. This should cause everyone to wonder to what extent other unaudited election procedures are regularly not followed. Most procedures are in place because they are intended to prevent election day problems, errors, and fraud. In fact, this memory card finding is very consistent with the Audit Coalition reports, which have consistently shown a significant level of failures to follow procedures, for instance the chain-of-custody failures described in the most recent Coalition report <read>
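For readers who want to check the figures, here is a minimal sketch of the arithmetic behind the percentages above. The counts come from the report figures quoted in this post; treating the 9% denominator as all 439 cards tested (297 used plus 142 not used) is our assumption, not something stated explicitly in the report excerpt.

```python
# Sketch of the rate arithmetic quoted above. Counts are from the report
# figures cited in this post; the 439-card denominator for the 9% figure
# (297 used + 142 not used) is our assumption.

def pct(part, whole):
    """part/whole as a percentage, rounded to the nearest whole number."""
    return round(100 * part / whole)

cards_used = 297       # cards used in the election and sent for testing
cards_not_used = 142   # cards sent to UConn but not used in the election
junk_cards = 41        # junk-data cards, all among the not-used cards

print(pct(junk_cards, cards_used + cards_not_used))  # 9  (over all cards tested)
print(pct(junk_cards, cards_not_used))               # 29 (over not-used cards only)

# Procedural-failure events on the 421 non-junk-data cards:
failures = 52 + 20 + 2 + 41 + 29  # the five categories listed above
print(failures)                   # 144
print(pct(failures, 421))         # 34
```

The gap between 9% and 29% is exactly why the sampling question matters: the same 41 bad cards yield very different headline rates depending on which denominator is used.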

Our bottom line:

  • A non-random partial post-election audit of memory cards is useful, but it is insufficient. A more rigorous sampling process would yield more accurate information and, just as importantly, it would eliminate the existing opportunity for errors or fraud to be covered up by not sending the cards for testing.  Last year we proposed and the GAE Committee passed 100% pre-election independent testing of memory cards.  We stand by that recommendation to protect the cards from front-end insider fraud and to make it less likely that election officials have to deal with junk data cards.  Post-election random testing or 100% testing of memory cards is also advisable.
  • How many more tests, reports, and elections will it take before the junk data problem is significantly reduced? Ridiculous, unacceptable, and unconscionable all come to mind to describe the junk data problem. A failure rate of 5%, 9%, 20%, or even 1% is way out of line for electronic equipment. Why do we stand for it? What about all the other states that use this exact same technology? Why are they putting up with it?
  • Almost every failure to follow procedures is an opportunity to cause problems, cover up errors, or cover up fraud. It is perhaps easier to understand occasional human failure to follow procedures exactly, every time. Once again, no matter whether the failure rate in handling memory cards is 5%, 10%, 20%, or 40%, it points to a likely much higher rate of failure to follow all procedures. How can we have confidence in elections given such a lack of ability, or attention, to following procedures, many of which are performed outside of public view and outside the purview of audits to discover? We can only hope that the Registrars of Voters will join in the commitment to meet a much higher standard.

************

Related story 5/28:  Diebold memory card problems in Florida — a different model, this time it is high speed wireless cards <read>

