Re: Separate trusted computing designs

From: Jonathan S. Shapiro
Subject: Re: Separate trusted computing designs
Date: Wed, 30 Aug 2006 14:56:05 -0400

On Tue, 2006-08-29 at 13:00 +0200, Marcus Brinkmann wrote:
> At Tue, 29 Aug 2006 10:41:22 +0200,
> Christian Stüble <address@hidden> wrote:
> > > On Thursday, 17 August 2006 09:18, Marcus Brinkmann wrote:
> > > I asked for use cases that have a clear benefit for the public as a whole
> > > or the free software community.
> > I personally would like to be able to enforce my privacy rules even on 
> > platforms that have another owner.
> If you can enforce a property about a system, then it is not owned
> exclusively by another party.  That's a contradiction in terms.
> What you can do is to engage in a contract with somebody else, where
> this other party will, for the purpose of the contract (ie, the
> implementation of a common will), alienate his ownership of the
> machine so that it can be used for the duration and purpose of the
> contract.  The contract may have provisions that guarantee your
> privacy for the use of it.
> But, the crucial issue is that for the duration the contract is
> engaged under such terms, the other party will *not* be the owner of
> the machine.

A contract cannot alienate ownership. It can grant exclusivity of use
for a period of time, and it may contain provisions for indemnification
of liability. These things do not alter ownership. They do not alter
certain conditions pertaining to seizure.

This aside, I disagree with an assumption that you appear to be making
here: you seem to deny the possibility that a machine may be contracted
on a non-exclusive but isolated basis.

A concrete example:

Just as I could contract to lease you a physical machine, I could
contract to lease you a *virtual* machine. In such a case, your reliance
on the isolation of the virtual machine is predicated on two things:

  1. Your faith in the contract (more precisely: in me as the other
     party to the contract).

  2. Your confidence in the VM implementation's isolation features.

Since it is technically feasible to validate what software a machine is
running, it is *completely* reasonable for you, the customer, to require
the ability to independently validate the software load that is running
on the leased machine. That is, it is reasonable to "trust, but verify".

Such verification does not contradict any principle of freedom held by
the FSF that I can see. This is not a case of the service provider
enforcing against the customer/user. This is a case of the customer
being able to verify the compliance of the service provider.
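The kind of independent validation described above can be sketched as a
hash-chain check in the style of TPM attestation: the provider reports a
log of loaded components plus a chained digest, and the customer replays
the chain against software loads it has validated itself. Everything
below is illustrative; the function names, measurement values, and
choice of SHA-256 are assumptions, not a real TPM interface.

```python
import hashlib

def extend(chain: bytes, measurement: bytes) -> bytes:
    """One hash-chain step: chain' = H(chain || H(measurement))."""
    return hashlib.sha256(chain + hashlib.sha256(measurement).digest()).digest()

def verify_load(event_log, reported_digest, trusted_digests):
    """Accept only if the log replays to the reported digest and every
    logged component is one the customer has independently validated."""
    chain = b"\x00" * 32  # conventional all-zero initial value
    for component in event_log:
        if hashlib.sha256(component).hexdigest() not in trusted_digests:
            return False  # an unvalidated component was loaded
        chain = extend(chain, component)
    return chain == reported_digest

# Example: the customer trusts exactly these two (hypothetical) loads.
kernel, vmm = b"kernel-image-v1", b"vmm-image-v1"
trusted = {hashlib.sha256(kernel).hexdigest(),
           hashlib.sha256(vmm).hexdigest()}

honest_digest = extend(extend(b"\x00" * 32, kernel), vmm)
print(verify_load([kernel, vmm], honest_digest, trusted))            # True
print(verify_load([kernel, b"backdoored"], honest_digest, trusted))  # False
```

Note that the check fails both when an unknown component appears in the
log and when known components are loaded in a different order, since the
chain value then differs.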

> > > I am always impressed how easily some fall to the fallacy that the
> > > use of this technology is voluntary for the people.  It is not.
> > > First, the use of the technology will be required to access the
> > > content.  And people will need to access the content, to be able
> > > to participate in our culture and society.  All the major cultural
> > > distribution channels are completely owned by the big industry,
> > > exactly because this allows these industries to have a grip-hold
> > > over our culture.  There is an option for popular struggle against
> > > this, but it will require a huge effort, and success is by no
> > > means guaranteed.
> > I did not talk about TC in general, but about the "privacy-protecting 
> > agent".
> I am not sure what you mean by that term.  The crucial point here is
> that TC removes the choice from the people which software to run.

I claim that Marcus's essential concern here is not about the features
of any particular operating system, but about the balkanization of
content. I definitely think that this is a good thing to worry about,
but it is not obvious that this concern should have any impact on the
long-term design of Hurd.

For the sake of discussion, let us assume that we build an operating
system -- I shall call it "Hurd" -- that does not support TC. Hurd does
not make these features available to users or applications. Let us
further assume that there exists in the world a popular operating system
-- I shall call it "Windows" -- that *does* support TC.

Observe that the vast majority of machines will run Windows, and that
this is sufficient to ensure that balkanizable content will be
balkanized. Further, this is true whether or not Hurd supports TC.

So: there exists content in the world that is going to get balkanized.
When that happens, the decision that users will really face is between
two options:

  1. They can enable TC to gain access to that content.
  2. They can elect not to enable TC and decline access to that content.

This choice is not a choice about operating systems. Choosing Hurd is
entirely equivalent (in this regard) to choosing Windows with TC
disabled or choosing Linux with TC disabled.

BUT, by denying support for TC, Hurd elects to make a choice as well. It
forcibly divides users into two mutually exclusive camps:

  1. Users who require access to balkanized information in order to
     participate in culture and society in all of the ways that Marcus
     describes.

  2. Users who can run Hurd.

The problem here is that there are unfortunate choices in the world.
People who have to deal with balkanized information may nonetheless want
to support freedom. Even if they are unable to make a black and white,
100% commitment, support for freedom is still support. By forcing those
people off of the Hurd we deprive them of choice. We say, in effect:
"Unless you are willing to turn your back on culture and society we are
not interested in dealing with you."

I do not believe that forcing this choice is right, moral, or ethical.

So what (in my opinion) is the moral and ethical path for the Hurd?

In my opinion, Hurd should delay support for TC as long as it can. It is
not consistent with the Hurd view of freedom to encourage or accelerate
or facilitate the arrival of TC and the balkanization of information.
This much seems very clear and obvious.

However, at some point the world will cross an inevitable threshold
concerning TC, and at that point I believe that the moral imperative
will swing the other way. In a world where TC is required to participate
in culture and society, the moral obligation of the Hurd will be to
maximize those freedoms that people can exercise by running
non-proprietary systems.

That point is not yet here, but when it arrives I believe that it will
be immoral and unethical for Hurd (and FSF) to *fail* to support TC.
Speaking out about the risks and costs and problems of TC will remain
appropriate, but polarizing the freedoms of users is not.

> However, it seems that the free
> software community forcefully rejects this technology, and implements
> various means to protect itself from these developments.

The free software community does not hold a monolithic view on this
issue.

> From the current GPLv3 draft:
> "Some computers are designed to deny users access to install or run
> modified versions of the software inside them. This is fundamentally
> incompatible with the purpose of the GPL, which is to protect users'
> freedom to change the software. Therefore, the GPL ensures that the
> software it covers will not be restricted in this way."

I am not aware of any general-purpose computer that is "designed to deny
*owners* access to install or run modified versions of the software
inside them". Perhaps some are being developed. This description
certainly does not fit the TPM-based technology that is being
implemented in PCs.

However, denying this to *users* is possible *without* a TPM. It is
simply a matter of setting a BIOS password to prevent OS reinstallation.
This is true whether or not the OS supports TC.
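The distinction above can be made concrete: a TPM-style PCR register
only *records* what was loaded; it does not prevent anything from
loading. The following toy sketch shows why, under the assumption of
SHA-256 chaining; the names and measurement values are hypothetical,
and this is not a real TPM API.

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR' = H(PCR || H(measurement)); there is no 'pcr_set' operation,
    so software cannot simply write a value a verifier would accept."""
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

ZERO = b"\x00" * 32  # PCRs start from a known value at reset

# Both boots complete; the TPM denies neither. Only the recorded
# values differ, so a verifier (or a key sealed to the stock value)
# will not recognize the modified configuration.
pcr_stock    = pcr_extend(pcr_extend(ZERO, b"bootloader"), b"stock-os")
pcr_modified = pcr_extend(pcr_extend(ZERO, b"bootloader"), b"modified-os")

print(pcr_stock != pcr_modified)  # True
```

A BIOS password, by contrast, actually blocks the reinstall up front;
the two mechanisms operate at entirely different points.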

