The Business of Electronic Voting

  • Ed Gerck
  • C. Andrew Neff
  • Ronald L. Rivest
  • Aviel D. Rubin
  • Moti Yung
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2339)

Abstract

This work reports on a Financial Cryptography 2001 panel where we concentrated on the emerging business of electronic voting.

Keywords

Vote System · Election Rule · Electronic Vote · Election Integrity · Vote Protocol

References

[1] “... one of the earliest references to the security design I mentioned can be found some five hundred years ago in the Hindu governments of the Mogul period, who are known to have used at least three parallel reporting channels to survey their provinces with some degree of reliability, notwithstanding the additional efforts.” Ed Gerck, in an interview by Eva Waskell, “California Internet Voting.” The Bell, Vol. 1, No. 6, ISSN 1530-048X, October 2000. Available online at http://www.thebell.net.
[2] Shannon, C., “A Mathematical Theory of Communication.” Bell Syst. Tech. J., vol. 27, pp. 379–423, July 1948. Available online at http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html. Shannon begins this pioneering paper on information theory by observing that “the fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.” He then proceeds to thoroughly establish the foundations of information theory, so that his framework and terminology have remained standard practice. In 1949, Shannon published an innovative approach to cryptography, based on his previous Information Theory paper, entitled Communication Theory of Secrecy Systems. This work is now generally credited with transforming cryptography from an art to a science. Shannon’s Tenth Theorem states (cf. Krippendorf and other current wording): “With the addition of a correction channel equal to or exceeding in capacity the amount of noise in the original channel, it is possible to so encode the correction data sent over this channel that all but an arbitrarily small fraction of the errors contributing to the noise are corrected. This is not possible if the capacity of the correction channel is less than the noise.”
[3] “When we want to understand what trust is, in terms of a communication process, we understand that trust has nothing to do with feelings or emotions. Trust is that which is essential to communication, but cannot be transferred in the same channel. We always need a parallel channel. So the question is having redundancy. When we look at the trust issue in voting, it is thus simply not possible to rely on one thing, or two things even if that thing is paper. We need to rely on more than two so we can decide which one is correct. In this sense, the whole question of whether the Internet is trusted or not is simply not defined. The Internet is a communication medium and whatever we do in terms of trust, it is something that must run on parallel channels.” Ed Gerck, testimony before the California Assembly Elections & Reapportionment Committee on January 17, 2001, in Sacramento. Assemblyman John Longville (D), Chair. For an application of this model of trust to digital certificates, see “Trust Points” from http://www.mcg.org.br/trustdef.txt excerpted in “Digital Certificates: Applied Internet Security” by J. Feghhi, J. Feghhi, and P. Williams, Addison-Wesley, ISBN 0-201-30980-7, pp. 194–195, 1998.
[6] “We say that information-theoretic privacy is achieved when the ballots are indistinguishable independent of any cryptographic assumption; otherwise we will say that computational privacy is achieved.” In Ronald Cramer, Rosario Gennaro, Berry Schoenmakers, “A Secure and Optimally Efficient Multi-Authority Election Scheme,” Proc. of Eurocrypt’97. Available online at http://www.research.ibm.com/security/election.ps.
[7] E. Gerck, “Fail-Safe Voter Privacy”, The Bell, Vol. 1, No. 8, p. 6, 2000. ISSN 1530-048X. Available online at http://www.thebell.net/archives/thebell1.8.pdf.
[8] Accuracy and Reliability are used here in the sense of standard engineering terminology, even though these different concepts are usually confused in nontechnical circles. Lack of accuracy and/or reliability introduces different types of errors: (i) Reliability affects a number of events in time and/or space, for example, errors in transfers between memory registers. We know from Shannon’s Tenth Theorem [2] that reliability can be increased so that the probability of such an error is reduced to a value as close to zero as desired. This is a capability assertion. It does not tell us how to do it, just that it is possible. This is the realm of requirements #12 and also #5, where one can specify an error rate as low as desired or, less strictly, an error rate “comparable or better than conventional voting systems”. (ii) Accuracy affects the spread of one event, for example whether a vote exists. Here, requirement #6 calls for 100% accuracy. The requirement is that no “voter-intent” or “chad” or “scanning” issue should exist, which is feasible if, for example, each voting action is immediately converted to a standard digital form that the voter verifies for that event. Accuracy error can be set to zero because 100% accuracy is attainable in properly designed digital systems that (e.g., by including the voter) have no digitization error. For an illustration of the above definitions of accuracy and reliability, see the four diagrams in http://www.safevote.com/caltech2001.ppt.
[9] “Contra Costa Final Report” by Safevote, Inc. Available upon request. Summary available at http://www.safevote.com.
[10] “Voting System Requirements”, The Bell newsletter, ISSN 1530-048X, February 2001, archived at http://www.thebell.net/archives/thebell2.2.pdf.
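References [1], [2], and [3] all invoke the same mechanism: reliability grows when the same message is reported over several independent channels and the receiver decides by majority, which is the capability (not the construction) asserted by Shannon’s Tenth Theorem. As a minimal illustrative sketch of this redundancy argument (the function names and parameters below are our own, not from the paper or its references), assuming channels that each independently flip a bit with probability p:

```python
import random

def channel(bit, p_err, rng):
    # A noisy channel: flips the transmitted bit with probability p_err.
    return bit ^ (rng.random() < p_err)

def majority(reports):
    # Decide by majority among an odd number of parallel reports.
    return int(sum(reports) > len(reports) // 2)

def error_rate(n_channels, p_err, trials=100_000, seed=42):
    # Monte Carlo estimate of the residual error after majority voting
    # over n_channels independent noisy copies of the same bit.
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bit = rng.randint(0, 1)
        reports = [channel(bit, p_err, rng) for _ in range(n_channels)]
        if majority(reports) != bit:
            errors += 1
    return errors / trials

def analytic_error_3(p):
    # Exact residual error for 3 channels: at least 2 of the 3 flipped.
    return 3 * p**2 * (1 - p) + p**3
```

With p = 0.05, the three-channel residual error is 3p²(1 − p) + p³ ≈ 0.007, an order of magnitude below the single-channel rate, and adding further channels drives it lower still.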

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Ed Gerck (1. Safevote.com, USA)
  • C. Andrew Neff (2. VoteHere.net, USA)
  • Ronald L. Rivest (3. Laboratory for Computer Science, Massachusetts Institute of Technology, Cambridge, USA)
  • Aviel D. Rubin (4. AT&T Laboratories — Research, USA)
  • Moti Yung (5. CertCo Inc., USA)