Re: Broken dream of mine :(

From: Bas Wijnen
Subject: Re: Broken dream of mine :(
Date: Tue, 22 Sep 2009 00:01:02 +0200
User-agent: Mutt/1.5.18 (2008-05-17)

On Mon, Sep 21, 2009 at 07:45:00PM +0200, Michal Suchanek wrote:
> If there is a call from the internet and it is received, then it
> means that some process registered a listen port and has to pay for
> the memory and CPU time required to receive the call.  Otherwise it
> will not be received.

Yes.  So the server is paying resources for receiving the call.  This is
not what we would want: for internal communication, the caller normally
pays for making the call.  However, that is technically impossible in
the case of networking.  My point is that a public server will listen on
the network anyway, and so pay for calls from those clients.  Even if it
would be better for internal callers to pay for their own calls, it
isn't strictly required: the server expects to pay for some callers
anyway, so a few more don't hurt.

> At the time your "toy system" was first announced I got the impression
> that the situation is different and that the new kernel will have
> resource accounting very different from that of Coyotos, possibly
> avoiding opaque memory completely.

I was hoping that this would be possible, although I did realize at the
time that it would be trivial to implement it anyway.  Now that I am a
bit further along in defining driver interfaces, I see that some drivers
will probably need opaque memory, so I shall implement it for socially
acceptable situations (meaning system drivers).

> > So:
> > technically opaque: opaque through technical protection
> > socially opaque: opaque through social agreement between people
> In your definition I am missing one part that you seem to imply.
> socially opaque: opaque through social agreement between people, in
> situations where technical means for ensuring that the memory is indeed
> opaque are missing or not used.

While strictly speaking I didn't mean to imply that, I do usually mean
"socially but not technically opaque" when saying "socially opaque".

> DRM is then a situation where memory is both socially and technically
> opaque, with the technical conditions enforcing a stricter policy than
> could be upheld through a social contract alone.

DRM is a situation where the information provider doesn't trust the
user, so a social contract cannot be used for protection.  This means
there is no social contract.  The memory is in fact technically but not
socially opaque, IMO.  That in turn means that breaking the protection
is an acceptable thing to do: why should I honour rules that were never
negotiated, imposed by someone who doesn't trust me?  Of course it is
usually illegal to break the protection (or at least to copy the
information), but IMO it's not immoral.
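The distinction being drawn here is that the two kinds of opacity are
independent properties, and DRM is the specific combination "technical
without social".  A minimal sketch (illustrative only; `MemoryRegion`
and `describe` are invented names, not part of any real system):

```python
from dataclasses import dataclass


@dataclass
class MemoryRegion:
    technically_opaque: bool  # enforced by the system: holder cannot inspect it
    socially_opaque: bool     # respected by a negotiated agreement between people


def describe(region):
    """Classify a region by the two independent opacity properties."""
    if region.technically_opaque and not region.socially_opaque:
        # The DRM case: enforcement without a negotiated agreement.
        return "technically but not socially opaque (DRM)"
    if region.socially_opaque and not region.technically_opaque:
        return "socially but not technically opaque"
    if region.technically_opaque and region.socially_opaque:
        return "opaque both technically and socially"
    return "not opaque"


print(describe(MemoryRegion(technically_opaque=True, socially_opaque=False)))
```

The four combinations make the argument concrete: "socially opaque" in
the usual sense of this thread is the second branch, and DRM is the
first.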

> > IMO social agreements are very important when it comes to what a
> > computer should and shouldn't do.  In particular, for a computer to
> > completely do what the user wants (that's what Iris aims for, and it's a
> > social goal at the core), it must also allow lying for the user.
> The problem does not lie in the technical features of the system here.
> The problem would rather lie in the availability of a technique that
> ensures the system cannot lie, and in pressure on users to employ
> this technique in their systems.

I do however have a weapon against them. ;-)  Iris is licensed under the
GPL 3, which means that changing it to use TC (trusted computing) is
only allowed if the encryption keys are published with the source.  I'm
not sure how well this weapon works, but I would probably try to use it
when the time comes.

> By implementing a more secure and reliable system you further the
> usability of the system for everyday tasks and the protection from
> viruses and software errors, but you also further the confidence in
> verification, because with a more reliable system it's harder to break
> the verification.

This is true, of course.

