Re: A _good_ and valid use for TPM


From: Isaac Dupree
Subject: Re: A _good_ and valid use for TPM
Date: Sat, 21 Feb 2009 21:10:11 -0500
User-agent: KMail/1.10.3 (Linux/2.6.27-11-generic; KDE/4.1.3; x86_64; ; )

Robert Millan wrote:
> On Sat, Feb 21, 2009 at 05:29:34PM +0200, Michael Gorven wrote:
> > On Saturday 21 February 2009 15:51:42 Robert Millan wrote:
> > > On Fri, Feb 20, 2009 at 09:45:28AM +0200, Michael Gorven wrote:
> > > > TPM can be used for good or for bad, but this is the case for
> > > > everything involving cryptography. We don't refuse to use encryption
> > > > algorithms because they could be used for DRM, so why should we
> > > > refuse to use TPM?
> > >
> > > I don't agree with this analogy.  Unlike cryptography, TPMs have been
> > > designed from the ground up to serve an evil purpose.  They *could*
> > > have designed them with good intent, for example either of these could
> > > apply:
> > >
> > >   - Buyer gets a printed copy of the TPM's private key when they buy a
> > > board.
> > >
> > >   - An override button that's physically accessible from the chip can
> > > be used to disable "hostile mode" and make the TPM sign everything. 
> > > From that point physical access can be managed with traditional methods
> > > (e.g. locks).
> > >
> > > But they didn't.
> >
> > Just to clarify, are you objecting to the use of TPM on principle and
> > because you don't want to encourage use of it, or because you think this
> > specific use (trusted boot path) is dangerous?
>
> I can't reply to this question, because it's not just a specific use, it's
> part of the design, of its purpose.  One of the design goals is remote
> attestation, which is a threat to our freedom and is unethical.
>
> If there was a device that behaves like a TPM except remote attestation is
> not possible (e.g. by one of the means described above), I wouldn't object
> to it, and I think the GNU project wouldn't either, but then referring to
> that as "TPM" is misleading.

(warning, I accidentally wrote a long essay exploring when these TPM-like 
things are evil)

Okay, suppose we use this means to make them not-so-evil: "Buyer gets a 
printed copy of the TPM's private key when they buy a board."  I'll call them 
"(pseudo)TPMs" when they meet GNU demands.  (They could be built with Free 
software, openly specified hardware designs, etc... why not throw in a bonus 
:-))

Alex Besogonov's use case is still possible, which should make us suspicious of whether our moral problem is really solved.  Here's how it's possible:

Assume he has access to computer A, which is physically secured.

He buys computer B, which he wants to be able to carry around the world without putting much effort into physical security.  He accepts that the hard drive contents can be tampered with, but he's willing to pay for a smaller cryptographic unit that guards its secrets jealously -- the (pseudo)TPM -- in order to "prove" to computer A that computer B hasn't been changed.  Alex is the buyer, so he receives the printed copy of the private key and (for example) stashes it where computer A is, or burns it.
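To make that concrete, here's a toy sketch of the challenge-response this scenario rests on (Python, standard library only; every name here is illustrative, not any real TPM interface -- a real TPM signs with an asymmetric key over PCR values, and I'm substituting an HMAC over a shared secret just to keep the example self-contained):

import hashlib
import hmac
import os

# Held inside computer B's (pseudo)TPM; Alex's printed copy is this
# same key, which is what lets *him*, and nobody else, verify quotes.
TPM_SECRET = os.urandom(32)

def measure_boot_chain(components):
    # Hash each boot component (BIOS, bootloader, kernel, ...) into a
    # running digest -- loosely analogous to extending a PCR.
    state = hashlib.sha256()
    for blob in components:
        state.update(hashlib.sha256(blob).digest())
    return state.digest()

def tpm_quote(nonce, measurement):
    # What computer B reports: a keyed digest over a fresh challenge
    # plus the measured boot state.
    return hmac.new(TPM_SECRET, nonce + measurement, hashlib.sha256).digest()

# Computer A's side: it knows the key and the software it put on B.
nonce = os.urandom(16)  # fresh challenge, so stale quotes can't be replayed
expected = measure_boot_chain([b"bios", b"grub", b"kernel"])
reported = tpm_quote(nonce, expected)  # would cross the network in reality
assert hmac.compare_digest(
    reported,
    hmac.new(TPM_SECRET, nonce + expected, hashlib.sha256).digest())

The point being: whoever holds the verifying key decides which software counts as "unchanged".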

How does that make computer B free?  Computer A and its buddies (if any) will still refuse to communicate with B if B runs different software than what Alex had put on it.  Unless there are laws in place, Alex can go on to sell (or lease) computer B to someone else, without giving *them* the printed private key.  Or, in the "override button" case, Alex could secure the (pseudo)TPM with a physical lock, and then sell computer B without giving the lock's key to the person who buys the computer from him.

This look at (pseudo)TPMs makes them seem to me like a weak way of asserting, at a distance, your legal/political ownership of a computer.  Even if manufacturers cooperate with GNU demands, the resulting (pseudo)TPMs are only useful for a purpose that is arguably evil... although the political arguments might differ depending on whether it's an end-user, a small non-computer-focused corporation, a large one, or a computer manufacturer asserting ownership.  And it depends on whether they share such information (with police, public internets, or other networks) in an attempt to detect fugitive computers.

I think in the U.S. we can make laws that coerce corporations pretty effectively, and individuals don't have much power (most people buy their computing devices from mass marketers and large corporations, not people on the street), so -- if we chose -- we could make laws ensuring that (pseudo)TPM technology was not used for its most serious moral transgressions.  (I'm not sure though, because the Internet is so strange.)  It also helps that most software is made by big corporations that can be sued (e.g. Microsoft, Apple), and most Free non-corporate software isn't evil.  But I suspect that in a part of the world where people know their immediate oppressors more intimately, where the mass market (the "free market"?) isn't trusted as much, the TPM (well, like any other piece of software or hardware, only worse) could be abused to control people.  I could spin that argument one way to say that U.S. capitalism is to be desired (at least once we can make the legislators do what we want, rather than what big rich corporations want).

I think my question is: how essential is it that individual people should (always? usually?) have an easy time taking their computers apart, putting them back together, and changing them, while having them behave as "black boxes" to the outside world?  How much physical security is acceptable, when you have a thing you can lock up and yet still use (until the object decides to stop cooperating with you)?  Cryptography simply makes clear that any one piece of hardware can be uncooperative (not share everything it knows), and this has big consequences.  The big manufacturers have a lot of control here...

Back to Alex's circumstance: you can just have the computer's hardware and software make it more difficult to install different software without the right password.  The BIOS could be unwilling to boot unrecognized software -- and since the BIOS is harder to remove/replace, an attacker who doesn't have the password can't flash the BIOS either.  The BIOS could store the secret key we use to identify ourselves to third parties (or that we use to encrypt the disk -- not stored on-disk, because hard disks are usually too easy to remove and read), but the BIOS will happily reveal it to a running system.  And we've already determined that (in Alex's case) we're not trying to protect against people who can spy on our RAM or install other treacherous hardware (in a non-TPM sense: hardware that simply does something not quite equivalent to its normal expected function), because that already compromises everything.
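As a sketch of the shape of that logic (Python again; all names hypothetical -- this is not any real BIOS interface):

import hashlib
import hmac

# The BIOS knows which boot images it recognizes, a hash of the admin
# password, and the machine's secret (the disk-encryption key, or the
# key we use to identify ourselves to third parties).
APPROVED_IMAGES = {hashlib.sha256(b"known-good bootloader image").digest()}
ADMIN_PW_HASH = hashlib.sha256(b"fixed-salt" + b"correct horse").digest()
MACHINE_SECRET = b"\x00" * 32  # stand-in for the real stored key

def bios_boot(image):
    # Refuse to boot unrecognized software...
    if hashlib.sha256(image).digest() not in APPROVED_IMAGES:
        raise SystemError("refusing to boot unrecognized software")
    # ...but happily reveal the secret to any system we were willing
    # to boot.
    return MACHINE_SECRET

def bios_flash(new_firmware, password):
    # Reflashing (the way around the boot check) also wants the password.
    candidate = hashlib.sha256(b"fixed-salt" + password).digest()
    if not hmac.compare_digest(candidate, ADMIN_PW_HASH):
        raise PermissionError("wrong password; not reflashing")
    # ...write new_firmware to flash here...

None of this resists someone who can pull the flash chip or spy on RAM -- but that's exactly the threat we agreed to ignore.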

The FSF's argument is that in our specific political circumstance, "end-user" can be defined legally and has a history of being used effectively to protect consumers, and that TPM specifically is an attempt by manufacturers (and whoever pressures them) to take some control without paying the expense, and the loss of modularity/interoperability/etc., of making their hardware actually hard to modify and put free software on.  Actually, it's also an attempt to be deceptive: because they're not banning end-user software freedom outright, people and governments don't look as askance at the monopolies that could soon be imposed on the end-users, perhaps?

I think that we can understand these issues better when we actually have access to Free software (including the BIOS) that implements a crypto stack to lock down a computer, on relatively Free hardware designed with some physical security (which is obviously possible, just as the mini-computers called TPMs are designed with some physical security).  The whole computer becomes analogous to a TPM in its ability to hide keys etc. -- except that it can be unlocked.  Then the issues go back to ones we've been dealing with for centuries, such as whether you should hide a spare key under the doormat in case something happens to the one you normally carry.  Ugh, alright, at least we're back in the physical.
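"Can be unlocked" could be as simple as wrapping the machine's secret under a key derived from the owner's passphrase, so the owner can always pull it back out.  A toy sketch (hypothetical names; a real design would use an authenticated cipher rather than this bare XOR wrap):

import hashlib
import os

def _derive(passphrase, salt, length):
    # Stretch the owner's passphrase into a wrapping key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               200_000, dklen=length)

def seal(machine_secret, owner_passphrase):
    # The machine hides the secret, but wrapped under a key only the
    # owner can re-derive -- the spare key under the doormat.
    salt = os.urandom(16)
    key = _derive(owner_passphrase, salt, len(machine_secret))
    wrapped = bytes(a ^ b for a, b in zip(machine_secret, key))
    return salt, wrapped

def unseal(salt, wrapped, owner_passphrase):
    key = _derive(owner_passphrase, salt, len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, key))

# The owner -- unlike a TPM's manufacturer -- can always unlock:
salt, wrapped = seal(os.urandom(32), "hunter2")
assert len(unseal(salt, wrapped, "hunter2")) == 32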

-Isaac




