[Axiom-developer] security and trust in Axiom (was: Bootstrapping)


From: Bill Page
Subject: [Axiom-developer] security and trust in Axiom (was: Bootstrapping)
Date: Thu, 10 Nov 2005 12:10:03 -0500

On November 10, 2005 10:43 AM C Y wrote:
> ... 
> The problem is you have to trust not only the person you are
> getting the binary from, but also that their computer is secure,
> that the files haven't been replaced, that the original computer
> it was compiled on was uncompromised, that the binaries used on
> that computer to make the binary you are downloading don't contain
> any earlier backdoors they can pass on...
> 
> As a point of fact, I don't think any true (i.e. human) 
> bootstrapping has occurred within recent history, so in a sense
> all software today relies on some unknown binaries from the past.
> This sounds like a good project for the government, actually -
> establish open software and hardware standards to create a
> system that, with the minimum necessary work from a human being,
> can start the binary chain reaction going.  Of course, even if I
> somehow propose that to my congressman, I'll probably just get a
> weird look...

Although some people might be a little paranoid about the
motivations of both government and military organizations,
I suppose it shouldn't hurt my argument here much if I promote
the work of an organization associated with the people for
whom I work.

The Canadian Defence Research Centre Valcartier has an ambitious
project called MaliCots that is directly related to this subject:

http://www.drdc-rddc.gc.ca/researchtech/malicots/home_e.asp
http://www.dodccrp.org/events/2003/8th_ICCRTS/Pres/track_7/1_1400Charpentier.pdf

The orientation is mostly towards the integration of Commercial
Off-The-Shelf (COTS) software, i.e. mostly closed source software,
within secure or trusted systems, but they also have a strong
interest in open source software for obvious reasons.

I don't consider these people paranoid at all, but they do take
their job very seriously.

I expect that there are other organizations with similar
motivations in various countries around the world.

> ... 
> > > As one of my colleagues said,
> > > 
> > > For a sysadmin, the absence of paranoia is called professional 
> > > incompetence.
> > 
> > I think your colleague does not have a clear understanding of
> > security.
> 
> I think the above statement is essentially shorthand acknowledgement
> of the following:
> 
> a)  when a computer is connected to the internet, anyone in the world
> can launch an attack against it.
> b)  given the number of people on the net, there will be bad actors;
> the statistical chance of this approaches unity quite closely (a
> quick computation below makes this concrete).
> c)  there is a significant, non-zero chance that my machine will
> come under attack.
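
No argument with (a) through (c); a back-of-the-envelope computation
makes (b) concrete. The numbers below are purely illustrative
assumptions of mine, not measurements:

    # Chance that at least one of N users is a bad actor, assuming
    # (illustratively) each user has an independent probability p of
    # malice: 1 - (1 - p)**N.
    p = 1e-6          # assumed per-user probability of malice
    N = 50_000_000    # assumed number of people on the net
    print(1 - (1 - p) ** N)   # prints 1.0; the complement is ~2e-22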

The proper question here is not how to make the chance zero,
but rather how to manage the risk acceptably. Life is a risk.
We need to balance that risk against the need for interaction.

> d)  any part of my system not personally verified by myself is an
> unknown, and I cannot state with certainty that it contains no
> vulnerabilities.
> e)  any binary not created on site from source code cannot be
> inspected, and therefore is only as trustworthy as the ENTIRE web
> of trust behind its creation.

That is not really true, as my reference to the MaliCots project
above shows, although granted, the lack of certified original
source code does make such verification more difficult.

> f)  since functionality must be provided, the best available measures
> are to reduce the required bootstrapping binaries to a minimum,
> reducing the number of potential problems.  Hence the appeal, in the
> open source world, of relying only on gcc for bootstrapping - if a
> problem is ever found, there is only ONE piece of software that has
> to be redone the "hard way".  Then everything else can be rebuilt
> automatically.

I think this strategy is wrong. Biological analogies demonstrate
that robustness is a function of the variety and diversity in
a population. In other words, it is wrong to keep all of our
(chicken :) eggs in one basket. It is interesting to note the
extent to which the Internet and free software are already adopting
this sort of strategy. Online disk storage is cheap and bandwidth
is rising. Open source code is duplicated in many places all over
the world. Open source allows great diversity and choice among
alternative software packages. This is something that we should
encourage.

> 
> So, if Axiom (at least SOME version of Axiom) can be built with just
> a working GCL, and GCL can be built by gcc, then Axiom only relies on
> gcc for bootstrapping, which is a common reliance in the open source
> world.

No. If anything, we should be prepared to build open source programs
like Axiom many different ways and with some variations. That is,
I think, the best way to ensure that Axiom will survive over a
long period of time.
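
One concrete way to cash in that diversity is what is sometimes
called "diverse double-compiling": bootstrap a self-hosting compiler
through two independent ancestries and check that the chains converge
on bit-identical binaries, which would expose a trojaned bootstrap
binary. The sketch below is purely illustrative; the build.sh
interface, the compiler paths, and the assumption that the build is
deterministic are all mine, not anything in Axiom's current build:

    import hashlib
    import subprocess

    def build(bootstrap_cc, out):
        # Assumed interface: build.sh compiles the compiler's own
        # sources with bootstrap_cc and writes the result to 'out'.
        subprocess.run(["sh", "build.sh", bootstrap_cc, out],
                       check=True)

    def sha256_of(path):
        # SHA-256 digest of a file, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Ancestry A: the binary we normally use (possibly compromised).
    build("/usr/bin/cc", "stage1-a")
    build("./stage1-a", "stage2-a")

    # Ancestry B: an unrelated, independently obtained compiler.
    build("/opt/other/cc", "stage1-b")
    build("./stage1-b", "stage2-b")

    # Both stage2 binaries were produced by compilers built from the
    # same source; with a deterministic build they must be identical
    # no matter which bootstrap binary started the chain.
    if sha256_of("stage2-a") == sha256_of("stage2-b"):
        print("fixed point reached: bootstrap chains agree")
    else:
        print("stage2 binaries differ: one toolchain is suspect")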

> ...
> > I think we should take some steps that we are not taking now to
> > help ensure that what we distribute is trusted by Axiom users.
> 
> I agree here - signed binaries might be a very good idea.  I'll
> admit I don't know too much about that myself (I admit I'm too
> trusting) but it's definitely the "right" way to do things.  I'll
> have to look into what the modern ideas on that issue are.
> 

I think it would be great if someone here could take the lead
in making some recommendations about what we should do to make the
distribution of Axiom source code and binaries more trustworthy.
I know there are some fairly extensive features within tla for
signatures, and Tom Lord (the main tla developer) seems to have
a strong interest in this subject.  We can also create signed
hash codes for the downloadable binaries.  The trick, it seems,
is how to introduce such procedures without making the whole
process too awkward for developers. We need some help with this.
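
As a very small start, here is a minimal sketch of the "signed hash
codes" part, assuming the release files sit in a ./dist directory and
a GPG signing key is already configured; the layout and file names
are my assumptions, not an existing Axiom procedure:

    import hashlib
    import pathlib
    import subprocess

    def sha256_of(path):
        # SHA-256 digest of a file, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Write "digest  filename" lines for every release artifact.
    dist = pathlib.Path("dist")
    lines = [f"{sha256_of(p)}  {p.name}"
             for p in sorted(dist.iterdir()) if p.is_file()]
    pathlib.Path("CHECKSUMS").write_text("\n".join(lines) + "\n")

    # ASCII-armored detached signature over the checksum file.
    subprocess.run(["gpg", "--armor", "--detach-sign",
                    "--output", "CHECKSUMS.asc", "CHECKSUMS"],
                   check=True)

A downloader would then run "gpg --verify CHECKSUMS.asc CHECKSUMS"
followed by "sha256sum -c CHECKSUMS" before trusting a binary.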

Regards,
Bill Page.
