grub-devel

Re: A _good_ and valid use for TPM


From: Robert Millan
Subject: Re: A _good_ and valid use for TPM
Date: Fri, 27 Feb 2009 21:28:22 +0100
User-agent: Mutt/1.5.18 (2008-05-17)

On Sat, Feb 21, 2009 at 09:10:11PM -0500, Isaac Dupree wrote:
> 
> (warning, I accidentally wrote a long essay exploring when these TPM-like 
> things are evil)
> 
> Okay, suppose we use this means to make them not-so-evil: "Buyer gets a 
> printed copy of the TPM's private key when they buy a board."  I'll call them 
> "(pseudo)TPMs" when they meet GNU demands.  (They could be built with Free 
> software, openly specified hardware designs, etc... why not throw in a bonus 
> :-))
> 
> Alex Besogonov's use case is still possible.  Which should make us suspicious 
> of whether our moral problem is solved.

I don't think there's anything morally wrong with Alex's use case, although
it's not technically irreplaceable either.

> I think my question is: how essential is it that individual people should 
> (always? usually?) have an easy time taking apart their computers, putting 
> them back together, and changing them, and having them behave as "black 
> boxes" to the outside world.

Note that our concern is software, not hardware.  Your microwave oven is
probably a computer too, and it's not a serious problem that it's locked
down and you can't change its software (that would be really dangerous!).

However, if your free programs, which you run on unlocked hardware, suddenly
stop working because they have to interact with third parties (e.g. a web
server) who now demand you use the "Trusted" version, you lose the ability to
modify them, which is one of your basic freedoms.

> I think that we can understand these issues better when we actually have 
> access to Free software (including BIOS) that implements a crypto stack to 
> lock down a computer, on relatively Free hardware designed with some physical 
> security (which is obviously possible, just as the mini-computers called TPMs 
> are designed with some physical security).  Whole computer becomes analogous 
> to TPM in its ability to hide keys etc. -- except that it can be unlocked.  
> Then the issues go back to ones we've been dealing with for centuries, such as 
> whether you should hide a spare key under the doormat in case something 
> happens to the one you normally carry.  Ugh, alright, at least we're back in 
> the physical.

Another interesting example is crypto cards.  They are designed to hide keys,
but they don't do any more than that.  They serve their purpose of
authenticating you to a third party, but can't spy on your memory, and
can't coerce you into running a certain program.
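To make that division of labour concrete, here is a minimal sketch of the
challenge-response flow such a card supports.  It's in Python with the
third-party "cryptography" package, and the CryptoCard class, its methods and
remote_service are invented for illustration, not any real card's API: the
point is only that the private key stays inside the card, the host merely
relays challenges and signatures, and the verifier holds nothing but the
public key.

# Sketch of crypto-card style challenge-response authentication.
# Requires the third-party "cryptography" package; the names below are
# illustrative, not any real card's interface.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature


class CryptoCard:
    """Stands in for the card: the private key never leaves this object."""

    def __init__(self):
        self._key = Ed25519PrivateKey.generate()   # generated "on-card"

    def public_key(self):
        return self._key.public_key()              # only the public half is exported

    def sign(self, challenge: bytes) -> bytes:
        return self._key.sign(challenge)           # the one operation the host may request


def remote_service(public_key, card) -> bool:
    """The third party: sends a fresh challenge, checks the signature."""
    challenge = os.urandom(32)
    signature = card.sign(challenge)               # host just forwards the challenge
    try:
        public_key.verify(signature, challenge)
        return True
    except InvalidSignature:
        return False


card = CryptoCard()
print("authenticated:", remote_service(card.public_key(), card))

Note that nothing here inspects the host's memory or dictates what software it
runs; the card only answers signing requests, which is the contrast I'm drawing
with a TPM used for remote attestation.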

-- 
Robert Millan

  The DRM opt-in fallacy: "Your data belongs to us. We will decide when (and
  how) you may access your data; but nobody's threatening your freedom: we
  still allow you to remove your data and not access it at all."



