lilypond-devel

Re: testing out Docker CI scripts?


From: Han-Wen Nienhuys
Subject: Re: testing out Docker CI scripts?
Date: Sat, 22 Feb 2020 21:18:30 +0100

On Sat, Feb 22, 2020 at 5:23 PM Jonas Hahnfeld <address@hidden> wrote:
> > I would be interested in your feedback.
>
> Not having run any of this, my immediate response would be that it's
> not running 'make doc', AFAICS.

For code changes, running 'make doc' should not be necessary: the
regression tests should cover all the behaviors we care about from a
programming perspective.

The time that David quotes for 'make doc' (~40 minutes) sounds wrong.

$ ls -1 input/regression/*.ly | wc
   1347    1347   55843

$ grep @lilypond $(find Documentation/ -name '*.*tely' \
    | grep -v 'Documentation/[a-z][a-z]/') | wc
   1828    1938  136964

Building the docs should take about 1.5x the time of building the
regtests. lilypond-book uses a shared database for snippets across all
languages, so there should be negligible additional cost for the rest
of the languages.

This assumes that the compilation time is dominated by LilyPond
compilation, though.
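
To put a rough number on the 1.5x estimate: the counts above give a
ratio of doc snippets to regtests of about 1.36, assuming each
@lilypond snippet costs roughly as much to build as one regtest:

$ # 1828 doc snippets vs. 1347 regression tests
$ echo "scale=2; 1828 / 1347" | bc
1.35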

> This will likely explode the time it takes to run it, but I think
> it's a good thing that Patchy does it right now. Considering the long
> time for 'make doc', I wonder whether saving ~4 minutes of compile
> time is worth the complexity of ccache?

The complexity is minimal, and if you are trying to fix a compile
problem for a different platform, getting fast turnaround on an
edit-compile cycle is huge.
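
The wiring is only a few lines. A minimal sketch of what it could look
like when building inside a container (assuming ccache is installed in
the image; the image name 'lilypond-ci' and the cache paths here are
placeholders, not the actual names from my scripts):

$ # placeholder image name and cache path; ccache must exist in the image
$ docker run --rm \
    -v "$PWD:/src" -w /src \
    -v "$HOME/.cache/lilypond-ccache:/ccache" \
    -e CCACHE_DIR=/ccache \
    -e CC="ccache gcc" -e CXX="ccache g++" \
    lilypond-ci \
    sh -c './autogen.sh && make -j$(nproc)'

The only persistent state is the cache directory mounted from the
host; nothing else has to change between runs.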

If CI becomes faster and cheaper, it will be easier to have instant
and automatic feedback on all versions of a patch.

> On a related note, I think we should first decide on our future
> direction with respect to tooling. Once we settle on Gerrit or GitLab
> (or something else entirely), both environments have their own ways
> of integrating CI systems. I was busy last week and will be until
> next weekend, or I would already have started a thread to move this
> forward.

The CI "system" (Jenkins, GitLab pipelines, etc.) is largely
orthogonal to the container setup (defining Docker images, etc.), so I
don't think there is much duplicated work. The scripts I'm offering
here also have the advantage that you can run them against git commits
straight from the local development environment.
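
As an illustration of that last point (not the actual commands from my
scripts; the image name, Dockerfile path and commit are placeholders):

$ # placeholder image name, Dockerfile path and commit
$ docker build -t lilypond-ci docker/
$ git archive some-branch-or-commit | docker run --rm -i lilypond-ci \
    sh -c 'mkdir /src && tar -x -C /src -f - \
           && cd /src && ./autogen.sh && make -j$(nproc)'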

I'm curious what you come up with for CI tooling. If a full run takes
122 CPU minutes, we'll fall outside the free tier everywhere and have
to put down some serious money (e.g. Travis CI has a limit of 120
minutes per build, and the GitLab free tier is 1000 CI minutes per
month, which at 122 minutes per run covers only about eight runs).

-- 
Han-Wen Nienhuys - address@hidden - http://www.xs4all.nl/~hanwen


