Re: [Mingw-cross-env-list] Broken Links
Thu, 17 Mar 2011 21:51:56 +1100
On 14 March 2011 01:34, Volker Grabsch <address@hidden> wrote:
> Daniel Stonier schrieb:
>> So, what's a practical way of dealing with this? Possibly:
>> 1) Bugfix broken download links in the stable release as soon as they
>> happen rather than just pointing people to the development branch.
> Such bugfixes require some time, as the new download URL will have
> to be stable - otherwise we'd fix it over and over again, which
> would quickly become a maintenance nightmare.
> If you find some time to help out here, by all means please do so!
> Nevertheless, we'll have to release more often, in particular
> immediately after a broken link.
I think releasing more often would tend to introduce more instability.
Even if we could do it instantly, that would mean every user has to
upgrade their environment (along with any other new packages)
just for one or two trivial changes. I imagine a predictable release
schedule would be preferable, provided we can find a way to do
lightweight minor updates.
If we frame the issue as a way to apply source updates rather than
thinking about releases/tarballs/mirrors, we have a few more options.
I'm just testing some ideas, so bear with me.
We could start with mercurial release branches. Say we found out that
postgresql had moved their old tarballs to an archive site. We create
a branch (2.18-release) and fix the URLs on that branch. At this
point we could decide whether to mirror the tarball ourselves or
simply switch to the new location (depending on the type of change).
For previous releases, we send a link (e.g. a wget one-liner that
fetches the fixed file) in an email. This is instantaneous and doesn't
require users to fully rebuild their environment, or us to rush out
releases.
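As a concrete sketch, such an emailed fix could amount to rewriting one line in the affected package file. Everything below (the file contents, version, and both URLs) is invented for illustration; it is not the real postgresql.mk:

```shell
# Simulated package file standing in for mingw-cross-env's src/postgresql.mk;
# the URL and layout here are illustrative only.
mkdir -p src
printf '$(PKG)_URL := http://wwwmaster.postgresql.org/old/postgresql-$($(PKG)_VERSION).tar.bz2\n' \
    > src/postgresql.mk

# The emailed one-liner: repoint the download URL at the archive site,
# keeping a .bak copy of the original file.
sed -i.bak 's|^\$(PKG)_URL.*|$(PKG)_URL := http://ftp.postgresql.org/pub/source/v$($(PKG)_VERSION)/postgresql-$($(PKG)_VERSION).tar.bz2|' \
    src/postgresql.mk
```

The point is only that a user can apply this without touching the rest of their environment.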
Of course, wget isn't ideal. We could do something in the Makefile to
poll for stable updates that we post on the website and fetch the
corresponding files (like we do for packages). We could even treat the
entire src directory as a package in its own right, but we're getting
ahead of ourselves.
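A minimal sketch of that polling idea, assuming (hypothetically) that we publish a plain-text list of changed files per release; the URL layout and list format are assumptions, not an existing mingw-cross-env facility, and fetch() is split out so the transport could be swapped:

```shell
# Sketch only: BASE and the updates.list format are invented.
BASE="https://example.org/mingw-cross-env/updates/2.18"

# fetch <local-path> <remote-path>: thin wrapper around the transport (wget here)
fetch() { wget -q -O "$1" "$BASE/$2"; }

# Pull the published list of changed files and refresh each one in src/.
check_updates() {
    fetch updates.list updates.list || return 0   # nothing published yet
    while read -r f; do
        fetch "src/$f" "src/$f"
    done < updates.list
}
```

A Makefile target could call check_updates before a build, so the check stays invisible to users.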
Mercurial repos are relocatable, so our next release could simply be a
tarball of the repo that can update itself with a simple hg pull. I've
tested a simulated release and update using the steps below, and it
seems to work fine. We could wrap the hg commands in the Makefile and
do something smart with hgrc so that people don't need to know about
mercurial - but you get the general idea. The only advantage this has
over simply cloning is that users won't need to have mercurial
installed until there is a problem that affects them.
I don't think mercurial is an unreasonable requirement for a
developer; maybe future "releases" could simply be clones of a branch?
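Wrapping the pull in a single entry point (which a Makefile target could call) might look like this; the branch name follows the 2.18-release example, and the function name is invented:

```shell
# Sketch: one update entry point so users never run hg directly.
# Fails with a readable message when mercurial isn't installed.
update() {
    command -v hg >/dev/null 2>&1 || {
        echo 'mercurial (hg) is required to fetch updates' >&2
        return 1
    }
    hg pull -b 2.18-release -u
}
```

This keeps mercurial an implementation detail until an update is actually needed.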
>> 2) Have a repository for mingw that holds sources for packages that mingw
> See above. This already happens by trying to find stable mirrors
> for each package. We're currently quite good at that, but we don't
> have it for all packages. So a new central mirror for at least
> those packages might be helpful.
> However, I think that having one big mirror for _all_ packages would
> be overkill, because the requirement to upload everything to that
> point would slow down releases. Also, big packages like Qt already
> provide very fast and stable download sites.
> Nevertheless, feel free to provide such a mirror in case you think
> it is necessary for your project. :-)
Probably inflating this a little, but (technical issues aside)
unofficial mirrors break my distributed trust model. I wouldn't like
to lose the transparency of downloading from each project's site. For
most cases of broken links, it's simply a re-organisation which can be
handled in a timely fashion by releasing updates as above. The
multiple URLs seem to be sufficient for unstable mirrors.
>> Can we find a practical resolution to make this better?
> First of all, I fully agree that we need more timely "bugfix"
> releases that fix download URLs etc. Since those don't require
> much testing, the only bottleneck is missing automatization.
> Human intervention is mostly needed because
> A) Freshmeat doesn't provide their automatic release
> API anymore,
> B) an upload to Savannah takes up to 24 hours until it
> is available on all mirrors.
> Here, point A) wouldn't be that bad if it didn't require
> the waiting period introduced by B). For B), I'm still
> looking for a good solution. 
The automation is only really required if we're thinking about
releases and tarballs; since we can do updates without those, it
becomes less of an issue.
Anyway, those are some of my thoughts.
See https://bitbucket.org/tonyt/mingw-cross-env/overview
#from cloned current mingw-cross-env
hg up 2.18
hg branch 2.18-release
cp dist/web/index.html doc/
rm -rf dist
hg rm tools
hg commit -m'prepare release 2.18'
#push to bitbucket
hg push --new-branch
#for mercurial >= 1.7, disable dotencode so older clients can read the repo:
hg --config format.dotencode=0 clone -b 2.18-release --pull . mingw-cross-env-2.18
#for older mercurial, a plain clone suffices:
hg clone -b 2.18-release --pull . mingw-cross-env-2.18
#could do something smarter with hgrc
cp .hg/hgrc mingw-cross-env-2.18/.hg/
tar cvf - mingw-cross-env-2.18 | gzip -9 >mingw-cross-env-2.18.tar.gz
rm -rf mingw-cross-env-2.18*
hg up default
hg up 2.18-release
...fix postgresql url
hg commit -m'package postgresql: fix download url for 2.18-release'
On another machine, download the initial release from:
#hg pull -u also works since branch is remembered
hg pull -b 2.18-release -u