Re: fork, trivial confinement, constructor

From: Marcus Brinkmann
Subject: Re: fork, trivial confinement, constructor
Date: Wed, 14 Jun 2006 12:59:50 +0200
User-agent: Wanderlust/2.14.0 (Africa) SEMI/1.14.6 (Maruoka) FLIM/1.14.7 (Sanjō) APEL/10.6 Emacs/21.4 (i486-pc-linux-gnu) MULE/5.0 (SAKAKI)

At Tue, 13 Jun 2006 23:24:31 -0400,
Eric Northup <address@hidden> wrote:
> > The constructor is
> >             explicitly designed to allow this type of control by the
> >             programmer (or system administrator) over the user.
> I find this statement to be misleading.
> The constructor is designed to allow two *programs* (which might be
> mistrusting of each other) to compose some of their authorities and
> form a third program.  *Either* of the two parent programs can also
> decide not to allow the construction if the resulting program would be
> unconfined, so the protection is symmetric, rather than one-sided.

It's symmetric with regard to confinement.  It is not symmetric with
regard to control.

> This is a statement about programs, not about humans; I am not at all
> concerned with programs' freedom.
> I believe that the constructor is compatible with preserving freedom
> for humans; if it turns out not to be, then those are bugs in the
> design and they should be fixed.

I suspect that you use the term "compatible" here in a very narrow
sense.  In the way of "one can build a system using this feature that
preserves freedom for humans", rather than: "this feature will
naturally lead to a system which preserves freedom for humans".

My position is that the encapsulated constructor mechanism is a
security threat, because it attacks user freedom.  My arguments for
this position have been carefully laid out in "Ownership and
Contracts", and have not been challenged yet.  Jonathan disagrees, but
his disagreement at this point amounts to a dismissal of the concerns.

The consequence, however, of taking the concern seriously is to apply
the standard practices of secure computer system design to take this
security threat into account and protect against it.

> I understand that there are many freedoms which the HURD's design must
> be able to protect.  But I think that, by using an overly simple
> example, some other important considerations were missed.

I am not sure to which example you are referring.

> Consider UNIX systems.  If the process being run has the set-UID bit,
> then the parent process P does not have the right to debug the child.
> The analogous scenario in a system with a constructor would be where P1
> had given the meta-constructor some additional capabilities.  The
> trivial confinement design as stated can't provide even the same
> securities as UNIX!

I think this has already been discussed and answered, but here is the summary:

First, Unix does not do memory resource accounting at all.  Thus, it
is not straightforward to decide what is the closest analogous
scenario to a suid exec in a system with resource accounting.  There
is a degree of freedom of choice here that leads to different results.

Note that the user is not allowed to kill a suid program that is
currently executing.  Thus, it cannot run on non-durable resources,
like the user's space bank.  Thus, I think that the appropriate
replacement mechanism in a system with resource accounting is a
service provided by the "owner of the suid file" (in Unix terms) which
runs on the owner's resources, and which provides an interface to the
user which supports the tasks users would want to do with suid programs
in Unix.  Note that this design does not provide any confinement
guarantees to the user.  Neither does Unix.

You can do this with a constructor.  But any ol' service would
suffice.  We do not need to specify what type of service it is, we can
leave this to the owner of the service to figure out.
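To make the shape of such a service concrete, here is a rough sketch
(all names are invented; a password-change service stands in for a
typical suid helper):

```python
# Sketch of the "suid replacement as a service" idea above: the
# privileged state lives on the service owner's (durable) resources,
# and the user only invokes an interface; the user never holds the
# service's process, memory, or storage.  All names are invented.

class PasswdService:
    """Stands in for a suid helper such as passwd(1)."""

    def __init__(self):
        # This table lives on the service owner's resources, out of
        # the requesting user's reach.
        self._db = {"alice": "old-secret"}

    def change_password(self, user, old, new):
        """The only operation the user can invoke.  No confinement
        guarantee is offered to the user, just as under Unix suid."""
        if self._db.get(user) != old:
            return False
        self._db[user] = new
        return True
```

The point is only the shape: the user gets the effect of the
privileged operation without ever holding the resources it runs on.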

For system services, I have also said I am willing to make an
exception.  This is because, as you pointed out, not only the
immediate parent has complete control, but also all grandparents, and
these grandparents may delegate some of that control to other
processes in the system.  The net result is that various system
processes can conspire to take some of the user's memory and deny the
user access to it.  But this is not in violation of my goals, because
it is the system which provided the memory to the user in the first
place.  Furthermore, in case the user and the machine owner are one
and the same person, it doesn't even make a difference.  If they are
not, then to start using a machine owned by somebody else already
requires special considerations, so I don't think the situation is
worse than without such an exception.

> Now, let me be clear: I am not suggesting that the HURD should
> encourage system administrators to be overly restrictive of their
> users' freedoms*.  And I definitely agree that the HURD should not
> support DRM.  But it is important to not throw the baby (usable
> security) out with the bathwater (DRM, unreasonable restrictions of
> users freedom, etc).

Again, please be specific about what you mean by "usable security".
As you phrase it, the term serves only as a pejorative.  I do not
agree that the constructor mechanism is synonymous with "usable
security", nor that all systems that lack it have "unusable security"
or "no security at all".

If you provide me a security goal that you think is worth supporting,
I can tell you if I disagree with the goal, or otherwise, I can try to
figure out how to implement this goal in my system structure.

> >             Therefore I am not so convinced that we want a constructor. It
> >             gets in the way of debugging, for example, and it doesn't really
> >             give any gain.
> The question of how to permit convenient debugging, without
> accidentally introducing security vulnerabilities is (as far as I am
> aware) a rather under-explored problem space.  Many designs are
> possible; the one I'm about to propose is just a suggestion to start
> discussion.
> It seems to me that a reasonable policy would be that a program should
> be able to get a debug capability to a "victim" program if:
> -The program can provide capabilities to the space bank and CPU
> schedule from which the victim is running.
>    AND
> -The program can provide copies of all the capabilities that were
> packaged into the victim's constructor by the meta-constructor.
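(Reading the proposal literally, the check could be sketched like
this; the capability types and names are invented for illustration:)

```python
# Literal sketch of the debug policy proposed above: a requestor gets
# a debug capability to a victim only if it can present the victim's
# space bank and CPU schedule, AND copies of every capability the
# meta-constructor packaged into the victim's constructor.  All names
# here are invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class Victim:
    space_bank: str        # capability the victim's storage comes from
    cpu_schedule: str      # capability the victim is scheduled by
    packaged_caps: tuple   # caps packaged by the meta-constructor

def may_debug(requestor_caps, victim):
    """True iff requestor_caps satisfies both conditions."""
    has_resources = (victim.space_bank in requestor_caps and
                     victim.cpu_schedule in requestor_caps)
    has_packaged = all(c in requestor_caps
                       for c in victim.packaged_caps)
    return has_resources and has_packaged
```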

Well, you are assuming that a requestor should only be allowed to
inspect programs with the permission of the creator.  My position is
that a user should be allowed to inspect programs with the permission
of the law.  (In fact, I am already compromising by providing
privacy, i.e. isolation between child processes.  The law does not
provide absolute privacy, but the privacy it does provide is quite
strong and comes much closer to absolute isolation than to absolute
control over how bits are copied.)

For example, in many countries, reverse engineering is legal for
specific purposes, like compatibility with a competitor's product.  These
exceptions are considered important by society to preserve a fair
market place.  In the system you propose, such exceptions by law would
essentially become useless, because the law would give people a right
that they can't technically exercise.

The challenge is not to show how the restrictions your system imposes
can be overcome voluntarily by all involved parties.  That is not a
big problem.  The challenge is to find a system design that is by
design in compliance with existing laws.  This is extremely
challenging, as the laws allow you to break the law sometimes, for
example in cases of emergency (which may be decided at a human's
discretion and later vindicated in court).

I think it was Pierre who suggested a feature where one can detect if
a page was written to by somebody else.  This may be an idea worth
pursuing: instead of restricting access, one could increase
transparency and monitoring.  This has apparently worked well for the
Incompatible Timesharing System at the MIT AI Lab in the 1970s.  I am
not saying that this is a viable system design in today's threat
scenarios, but I think that it may be a viable design for suspicious
collaboration in some cases, especially if combined with a social
infrastructure (personal contacts) and/or newer technologies (like
versioned file systems, etc.).  Wikipedia operates along similar
principles.  I understand that these approaches are philosophically
quite different from what you are working on.  My point is that
alternative models exist, that, while having their own flaws, can work
in practice.
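As a toy illustration of that transparency idea (invented names, not
any existing Hurd interface):

```python
# Toy illustration of "increase transparency instead of restricting
# access": writes to a shared page are never blocked, but each write
# is logged with its author, so modification by somebody else is
# detectable afterwards.  All names are invented.

class AuditedPage:
    def __init__(self):
        self.data = b""
        self.writers = []            # author of each write, in order

    def write(self, author, data):
        self.data = data
        self.writers.append(author)

    def modified_by_other(self, me):
        """Has anybody but `me` written to this page?"""
        return any(w != me for w in self.writers)
```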

> I often run shell sessions inside Emacs (or Eclipse, if I'm doing Java
> work on a fast computer).  Now, Emacs is a great program.  But it also
> obeys (presumed) hostile code -- there are a bunch of .el modules I've
> downloaded and installed.
> So, if I'm running on a system where parent programs have the unlimited
> authority to debug their children, then I'd have to stop using Emacs'
> shell features.  (Assuming, of course, that we're using an OS where
> normal programs don't already have full read+write access to a user's
> home directory and debug authority for all programs started by a user.)

I don't think that you solve these issues in your system design
either.  The emacs program would require the cumulative authorities
that you have to provide to the programs you start from its shell.  In
other words, your emacs session will already have a concentration of
excess authority.  Furthermore, you have to somehow control the I/O
channels of your program.  You also need features to identify the
program emacs actually started, and you need to be able to rely on
these features not to betray you.  This is mainly a user interface
challenge, but also a technical design challenge.

Once you have solved these problems, and I can roughly see a couple of
ways to do so, I don't think we are talking about good ol' emacs
anymore :)

But anyway, if you really want to do so, there is nothing in the Hurd
that stops you from implementing an opaque space bank and a
meta-constructor, running them under your user session, and using them
for your own program instantiations.  You might not have much success
in convincing others to rely on their confinement guarantees, but that
is outside of your emacs example.

