gnu-misc-discuss

From: Alexander Terekhov
Subject: Re: GPLv3 comedy unfolding -- novalis: "GPLv3 doesn't break voting machines"
Date: Tue, 15 Aug 2006 21:33:20 +0200

http://www.fsf.org/blogs/licensing/gplv3-tc-voting-machines

-------
GPLv3 and Voting Machines

Submitted by novalis on 2006-08-14 06:32 PM. Licensing
GPLv3 doesn't break voting machines.

Anti-DRM and voting machines

Some people are worried that GPLv3's anti-DRM provisions will make it
impossible to use the GPL for software for voting machines. This worry
ignores the actual text of the GPL, and relies on false conceptions of
how treacherous computing and voting machines work.

In the United States, voting machines are not built to be secure. They
do not have to meet any security standards (unlike slot machines).
Because of this, no voting machine manufacturer will put in a trusted
platform module, because this would be a needless expense. Indeed,
voting machine manufacturers lobby very hard to prevent even the
rudimentary security measure of a paper trail. Given these
well-publicised dangers, it would be foolish to use electronic voting
machines today.

But let's imagine that some day, in some country, a voting machine
manufacturer (let's call them Diebold, for short) decides to build the
most secure voting system imaginable. To start with, Diebold will go to
the literature. There are at least two protocols available for secure
electronic voting: C. Andrew Neff's, and David Chaum's. Neither relies on
the security of the client at all, so the TCG's secure boot and remote
attestation protocols won't help. That's because there's no
way for a checker to prove that the machine they are talking to is the
machine that the voter is seeing. Imagine that there is a client which
is secured by pure magic, so that we know for certain that when the user
presses the X button, the machine records one vote for candidate X. Now
a malicious person puts a false machine *on top of* the secure machine.
Anything that the secure machine displays on its screen to demonstrate
its security, the malicious machine simply copies via a video camera.
The false machine reads in votes for candidate X, and, with its
electronic finger, presses the Y button.

Let's imagine that, for whatever reason, this attack isn't possible, and
thus Diebold does want to trust their clients using remote attestation.
They can *still* do it using GPLv3 software. But to understand why, you
need to understand some of the TCG security model.

A machine is trusted, according to TCG, "if it always behaves in the
expected manner for the intended purpose." Note the use of the passive
voice to conceal whose expectations and intents are involved -- hint:
it's not yours. That's different from the standard definition of trust,
which is that something is trusted if it can break your security policy.
Because both kinds of trust are interesting, I will use the terms
TCG-trust and Standard-trust, respectively. TCG-trust is achieved by
having the manufacturer of the Trusted Platform Module (which is
probably not Diebold, but some chip company -- call it Intel, for short)
certify that it implements the TCG specification. This means that the
TPM must contain a secret that nobody -- not even the owner of the
machine -- can know. Of course, nothing can prevent Intel from knowing
this key, so everyone must Standard-trust Intel.
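
As a rough sketch of that dependency (not the actual TCG protocol -- the
names below, and the use of plain Ed25519 signatures, are made up purely
for illustration), the verifier's confidence bottoms out in the vendor's
signature:

    # Rough sketch, not the real TCG protocol: the only thing telling a
    # remote verifier that an attestation key really lives inside a genuine
    # TPM is a certificate signed by the TPM vendor, so every relying party
    # must Standard-trust that vendor.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
    from cryptography.exceptions import InvalidSignature

    vendor_key = Ed25519PrivateKey.generate()      # held by "Intel"

    # In the factory: the TPM's secret is generated and endorsed by the vendor.
    tpm_secret = Ed25519PrivateKey.generate()      # nobody else may know this
    tpm_public = tpm_secret.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    endorsement = vendor_key.sign(tpm_public)      # the vendor vouches for the TPM

    def verifier_trusts_tpm(tpm_public: bytes, endorsement: bytes) -> bool:
        """A remote verifier can only check the endorsement against the
        vendor's public key -- that is, it has to take the vendor's word
        that the key is locked inside a real TPM."""
        try:
            vendor_key.public_key().verify(endorsement, tpm_public)
            return True
        except InvalidSignature:
            return False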

If the secret key were provided by the machine's owner, instead of
generated within the TPM, owners could be sure that Intel didn't know
it. Under this scheme (which is equivalent to Seth Schoen's
owner-override proposal), owners could also break the TCG-trust, because
their machines would be able to act as they intended. Of course, a
voting system that depends on Standard-trusting a vendor of TPMs is not
very secure at all. But let us imagine that this objection could somehow
be overcome, and continue building our TC-based voting system.

Here's how it will work: Diebold builds a voting machine which includes
some (L)GPLv3 components. Let's say Linux, and the GNU C Library, and Qt
(assuming it upgrades). They install this software on the voting
machines, and set up secure or authenticated boot. Then they set up a
server that requires, via remote attestation, that the voting machines'
software hasn't been altered. Now the server knows that the machines
will operate in the expected manner for the intended purpose.
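
A minimal sketch of that attestation gate (invented names; the real
protocol involves PCR registers, nonces, and the TPM quote format) might
look like this:

    # Sketch only: the election server takes votes from a machine only if
    # the machine's measured software matches the published (L)GPLv3 image
    # that everyone has the source for, and the quote is signed by the
    # machine's TPM-held attestation key.
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    EXPECTED = hashlib.sha256(b"published voting-machine image").digest()

    machine_key = Ed25519PrivateKey.generate()   # attestation key inside the TPM

    def machine_quote(booted_image: bytes, nonce: bytes):
        """The voting machine reports what actually booted, signed by its TPM."""
        measurement = hashlib.sha256(booted_image).digest()
        return measurement, machine_key.sign(measurement + nonce)

    def server_accepts(measurement: bytes, signature: bytes, nonce: bytes) -> bool:
        """The server refuses machines whose software has been altered."""
        try:
            machine_key.public_key().verify(signature, measurement + nonce)
        except InvalidSignature:
            return False                    # not a genuine TPM quote
        return measurement == EXPECTED      # reject altered software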

Governments can TCG-trust the whole system by (a) having the full source
code for every part of the system and (b) remote-attesting both the
client and the server. To analyze this distribution under GPLv3, we need
to ask which keys are required to install modified versions of the
software:

    The Corresponding Source also includes any encryption or
authorization keys necessary to install and/or execute modified versions
from source code in the recommended or principal context of use, such
that they can implement all the same functionality in the same range of
circumstances. (For instance, if the work is a DVD player and can play
certain DVDs, it must be possible for modified versions to play those
DVDs. If the work communicates with an online service, it must be
possible for modified versions to communicate with the same online
service in the same way such that the service cannot distinguish.) A key
need not be included in cases where use of the work normally implies the
user already has the key and can read and copy it, as in privacy
applications where users generate their own keys. However, the fact that
a key is generated based on the object code of the work or is present in
hardware that limits its use does not alter the requirement to include
it in the Corresponding Source. 

We haven't yet discussed how these systems are upgraded. Because of the
remote attestation protocol, the clients need some way to know that an
update to the server is authorized, and vice versa. This will be either
a keypair, or a secret key -- it doesn't really matter which. For
security reasons, Diebold should not know this key, since if they did,
they could upgrade the machines without election officials' consent. So
the user -- that is, the government -- must already have the key, and it's
not part of the Corresponding Source.
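
As a sketch, again with invented names, the upgrade check amounts to
little more than a signature verification against that government-held
key:

    # Sketch: the voting machines accept a new server image only if it is
    # signed with the election authority's key.  The government generates
    # and keeps that key; Diebold never sees it, so it cannot push updates,
    # and GPLv3's "user already has the key" carve-out applies.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    authority_key = Ed25519PrivateKey.generate()   # generated by the government
    authority_pub = authority_key.public_key()     # baked into the clients

    def sign_update(update_image: bytes) -> bytes:
        """Run by election officials, never by the manufacturer."""
        return authority_key.sign(update_image)

    def client_accepts_update(update_image: bytes, signature: bytes) -> bool:
        """A client installs an update only with the authority's approval."""
        try:
            authority_pub.verify(signature, update_image)
            return True
        except InvalidSignature:
            return False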

Four hypotheticals down the line, the new Corresponding Source
provisions *still* haven't prevented the use of GPLv3-licensed software
in voting machines. So what situation are people really worried about?
The only answer I can think of is that voting machine manufacturers have
convinced these people that the only way to secure voting is to hand
over all control of the machines to them. Sensible people will see this
security model for what it is -- a naked power grab. 

Copyright © 2004, 2005, 2006 Free Software Foundation, Inc., 51 Franklin
Street, Fifth Floor, Boston, MA 02110-1301, USA. Verbatim copying and
distribution of this entire article are permitted worldwide, without
royalty, in any medium, provided this notice is preserved.
-------

regards,
alexander.

