Re: Confinement (even with TPMs) and DRM are not mutually exclusive

From: Bas Wijnen
Subject: Re: Confinement (even with TPMs) and DRM are not mutually exclusive
Date: Wed, 7 Jun 2006 11:48:36 +0200
User-agent: Mutt/1.5.11+cvs20060403

On Wed, Jun 07, 2006 at 01:43:03AM -0400, Jonathan S. Shapiro wrote:
> On Tue, 2006-06-06 at 23:13 +0200, Bas Wijnen wrote:
> > Now when the program is being run as it should be by the user, everything
> > is fine.  But what happens if the user will run it "improperly", for
> > example in a debugger?  Then the guarantee is no longer valid.  However,
> > that doesn't mean that it's a wrong thing to do.  In particular, when the
> > user starts a program, that's the user's business, not the
> > administrator's, and certainly not the programmer's.  If the user wants to
> > run the program in a debugger, then it mustn't start protesting.  It must
> > do what the user wants...
> In my view, this is completely wrong.

I thought so. :-)

> The problem here is liability and lawsuits: if something goes wrong,
> there is no evidence to decide later whether the program was executing
> legitimately or not. Neither the developer nor the user is adequately
> assured that a robust determination of whether liability might exist
> under contract is possible.

This is always the case.  The only solution to this is that the programmer
delivers you a computer to run the program on, and you're not allowed to open
this computer.  This is essentially what TC does, except that the user must
pay for the computer directly instead of via the programmer.

Let's be very clear about this: this kind of guarantee is simply not possible
without TC.

Since I wasn't talking about TC, this wasn't relevant to my e-mail, but it's
interesting anyway, so let's continue the discussion. :-)

> Alternatively, it is possible and legitimate to establish more binding
> and checkable relationships. For example, a developer may say to a
> customer "I have tested this program in a certain environment, and I am
> willing to accept significant liability if it fails while running in
> that environment. However, *because* I accept liability, I will only
> permit you to run the program in a way that I can validate. Either I can
> validate it, or I will not accept liability." [This does not stop us
> from having an alternative, non-liable version; we are merely
> establishing and auditing the conditions under which liability applies.]

Right.  And there is nothing wrong with this in principle (except that it
requires mechanisms in the system that I'd like to avoid, but that's a choice
I make).

I don't think programmers are actually going to accept liability, though, so
in practice this functionality gains us nothing.  But that's yet another
discussion.

> The potential to mechanically audit the conditions of execution in order
> to allow a simple determination of whether liability exists is critical.
> Simultaneously, the ability to guarantee that a program is only executed
> as intended is absolutely necessary in order for a developer to consider
> accepting liability. In questions of liability, it isn't enough to check
> the interface. You rely on the implementation as well.

Of course.  And this is all possible with TC, I know.

> From the standpoint of design philosophy, it is important to me to
> design a system in which developers and users can be held accountable
> for their contracts.

I understand why this is attractive.  However, I also see that it isn't how
things work in the world.  If Microsoft had such a system, what do you think
they'd do with it?  Accept liability?  No way.  Force a load of crap on
users?  Sure thing.  I assume you know that Media Player's license agreement
essentially gives Microsoft the right to take over your computer (look at
your documents and installed programs, uninstall programs, install new
programs, everything).  This type of technology actually makes it possible
for them to use that right.

Remember Sony's rootkit?  It didn't really work, because people could
actually check what it did.  What if they can't check anymore?  Is Sony
going to accept liability for things that go wrong?  Of course not.  But
they will take over your computer too, and nobody will notice, because
everything is protected against those annoying people called "users".

> The problem with DRM isn't the mechanism. The problem is that it is a badly
> structured, one-sided contract.

Very well.  I'm an anarchist.  I believe that the best world possible would be
one without police or army forces (but with governments, by the way, just
without armed power).  I also see around me that the world isn't ready for
such a system.  Therefore I am not in favour of dismantling all police forces,
starting tomorrow.

I see the same situation here: In a world as you envision it, TC could be used
for some good (well, at least acceptable) things.  However, in the current
world we can be very sure that it will be abused on a large scale, and things
will be worse than without TC.  In such a case, I say it is a very bad idea
to implement TC "because it is needed for something good".  Start by making
the world better, so that it can handle TC.  Until then, stay away from it.
(Just as I don't expect to see the day the world is ready for anarchism, I
don't expect to see the day the world is ready for TC.  To be clear: this is
not a project you spend some time on and then finish.  Although the time
would probably be very well spent. :-) )

> The reason that ActiveX is such a disaster is that it relies on
> interfaces. You install an ActiveX control. It uses 15 others. One of
> these gets upgraded in a way that breaks the interface contract. In
> consequence, a seemingly unrelated control misbehaves in a way that the
> user can observe.

That has nothing to do with programmers and users.  That's programmers
designing lousy interfaces which need to change, or programmers using
undocumented (and changing) features, or both.  You can say this won't happen
if programmers are liable, but it also won't happen if programmers just write
proper code.  In particular, this is not a problem of a changing
implementation with an unchanging interface.  This is a problem of a changing
interface (possibly undocumented).
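
The distinction can be sketched with a toy example (the function names here
are made up for illustration, not taken from ActiveX): a caller written
against a stable interface survives a new implementation, but misbehaves the
moment the interface contract itself changes.

```python
# Version 1 of a component, with a documented contract:
# return the value for key, or None if the key is absent.
def lookup_v1(table, key):
    return table.get(key)

# Version 2: new implementation, same contract.  Callers are unaffected.
def lookup_v2(table, key):
    for k, v in table.items():   # slower, but the interface is unchanged
        if k == key:
            return v
    return None

# Version 3: the *interface* changes -- missing keys now raise KeyError
# instead of returning None.  This is the kind of change that breaks
# seemingly unrelated callers.
def lookup_v3(table, key):
    return table[key]

def caller(lookup):
    # Written against the v1 contract: relies on None for missing keys.
    return lookup({"a": 1}, "b") is None

print(caller(lookup_v1))   # True
print(caller(lookup_v2))   # True: implementation changed, interface did not
try:
    caller(lookup_v3)      # interface changed: the caller now misbehaves
except KeyError:
    print("broken by interface change")
```

This is the point above: liability or no liability, it is the changing (or
undocumented) interface that breaks things, not a changing implementation
behind a stable one.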

> My goal is accountable systems.

Then please work on making the world ready for them.  You say you need TC,
but the world currently can't handle TC, which means you probably won't even
get to start implementing accountable systems.  Working on the world first
also means you will be making it a better place, not a worse one.


I encourage people to send encrypted e-mail (see http://www.gnupg.org).
If you have problems reading my e-mail, use a better reader.
Please send the central message of e-mails as plain text
   in the message body, not as HTML and definitely not as MS Word.
Please do not use the MS Word format for attachments either.
For more information, see

Attachment: signature.asc
Description: Digital signature
