
Re: Separate trusted computing designs

From: Marcus Brinkmann
Subject: Re: Separate trusted computing designs
Date: Thu, 31 Aug 2006 14:17:01 +0200
User-agent: Wanderlust/2.14.0 (Africa) SEMI/1.14.6 (Maruoka) FLIM/1.14.7 (Sanjō) APEL/10.6 Emacs/21.4 (i486-pc-linux-gnu) MULE/5.0 (SAKAKI)

At Wed, 30 Aug 2006 14:56:05 -0400,
"Jonathan S. Shapiro" <address@hidden> wrote:
> A contract cannot alienate ownership. It can grant exclusivity of use
> for a period of time, and it may contain provisions for indemnification
> of liability. These things do not alter ownership. They do not alter
> certain conditions pertaining to seizure.
> This aside, I disagree with an assumption that you appear to be making
> here. It appears to me that you deny the possibility that a machine may
> be contracted on a non-exclusive but isolated basis.

I am disappointed.  At your request, I compiled a long essay on this
matter, including definitions of the words "ownership" and "contract"
based on definitions suggested by Hegel.  But apparently you have
either not read those definitions, forgotten them, or chosen to ignore
them.  Under these definitions, alienation of ownership covers exactly
what you describe, and also the other examples above, which you do not
recognize as alienation of ownership.

Given that my definitions (which agree with the common understanding
of the terms) have not been challenged, I am upholding them.  If you
disagree, you will have to provide your own definitions.


> Such verification does not contradict any principle of freedom held by
> the FSF that I can see. This is not a case of the service provider
> enforcing against the customer/user. This is a case of the customer
> being able to verify the compliance of the service provider.

Whether the principles of freedom held by the FSF are violated in this
example depends on who distributes which FSF-licensed work to which
other parties under which conditions.  This is a non-trivial question,
exactly because the "trusted computing" model of multi-party computing
diffuses ownership.

You can trivially construct examples of "trusted computing"
applications which do not violate the FSF's principles.  In
particular, any application which does not use free software would be
such a case.  This of course does not imply that there are no other
reasons to reject such applications in general.

> I claim that Marcus's essential concern here is not about the features
> of any particular operating system, but about the balkanization of
> content.

This is one of my concerns, but far from the only one.  I have
expressed many other concerns in my essay.  Ignoring those concerns is
your choice; however, doing so neither removes them from my list nor
invalidates them.

My concern goes right to the heart of the "trusted computing" model,
which is the assumption that information shared with other people can
and should be proprietarized.  My concerns are the various social
implications of an attempt to do so.

The rest of your argument is based on broad assumptions that the
struggle is already lost.  Different political circumstances may
require different political action.  That is a trivial truism, and one
of the reasons that the LGPL exists, for example, or the "system
component" exception in the GPL.  Furthermore, the best strategy for
action is a deeply personal question that everybody has to decide for
themselves.  There are numerous options, including those that you
mentioned, but also completely different ones.

However, I also think that there is a fallacy in your argument about
choice.  Choosing between different systems that implement the same
"trusted computing" policies is not really a choice on this matter.
All such systems will be equally "useless" for the user.  You may say
that these systems still compete in the areas not subjected to
"trusted computing" policies.  However, under your assumptions, these
areas will be marginal, because "trusted computing" policies are very
invasive and take over the whole computer system, hardware and
software.  Case in point: no currently upcoming system (the next
Playstation, the next Windows version) will be able to play
next-generation movie DVDs out of the box (interestingly, I have heard
that some stand-alone players will support them, for non-technical
reasons).

Nevertheless, I agree with the core of your argument: there can be a
moral case for supporting a morally offensive system in order to ease
the damage it inflicts on its victims.  However, I think the argument
is not quite as strong as you present it, certainly not in this case,
where I think superior options are available.

> > >From the current GPLv3 draft:
> > 
> > "Some computers are designed to deny users access to install or run
> > modified versions of the software inside them. This is fundamentally
> > incompatible with the purpose of the GPL, which is to protect users'
> > freedom to change the software. Therefore, the GPL ensures that the
> > software it covers will not be restricted in this way."
> I am not aware of any general-purpose computer that is "designed to deny
> *owners* access to install or run modified versions of the software
> inside them". Perhaps some are being developed. This description
> certainly does not fit the TCPM-based technology that is being
> implemented in PC's.

The error in your logic is the assumption that the people who bought
TCPM-based technology are the owners of the machine.  My analysis
showed, using suitable definitions of the terms, that this is not the
case.

One of the cases that triggered these developments is the TiVo device.
You may not consider it a general-purpose computer, but it certainly
contains general-purpose computer hardware and a general-purpose
operating system, and people naturally want to exploit this to their
advantage.  This is actually part of my argument: computers take over
more and more of our personal space.  This means that it is
increasingly important for people to retain the ability to personalize
them (the full argument is in my essay).

> However, denying this to *users* is possible *without* TCPM. It is
> simply a matter of setting a BIOS password to prevent OS reinstall. This
> is true whether or not the OS supports TC.

BIOS passwords are a joke, and you know it ;)

