Re: [lmi] Dependencies and submodules (was: Tests PR 174)


From: Vadim Zeitlin
Subject: Re: [lmi] Dependencies and submodules (was: Tests PR 174)
Date: Thu, 25 Mar 2021 14:00:01 +0100

On Thu, 25 Mar 2021 10:10:39 +0000 Greg Chicares <gchicares@sbcglobal.net> 
wrote:

GC> Let's consider some other scenarios:
GC>  - using 'git bisect' to find a regression
GC>  - using 'git switch --detach' to test behavior as of some prior SHA1
GC> In such scenarios, it would seem that there's no easy general answer,
GC> because rewinding a library submodule to a prior state (whether manually
GC> or with a hook that does that automatically) is only a small part of the
GC> work, while rebuilding it is a big task.

 Yes, but the point is that sometimes you do want to rebuild and sometimes
you don't, and only you, the human -- not Git or a shell script -- can
decide which it is.
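
 For example, when bisecting a regression on the lmi side the submodules
can be left completely alone, whereas chasing a regression in the
dependency means syncing and rebuilding it at every step. Very roughly
(the submodule path and tag name below are only illustrative, not the
real lmi ones):

    # lmi-side regression: never touch the submodules
    git bisect start
    git bisect bad HEAD
    git bisect good some-known-good-tag
    # build lmi, test, then "git bisect good"/"git bisect bad" as usual

    # dependency-side regression: at each step, also do
    git submodule update --init third_party/wxwidgets
    # ...and then rebuild it, which is the expensive part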

GC> It depends on whether the library must be rebuilt. If I'm bisecting to
GC> find a numerical lmi regression, then a hook that updates a wx submodule
GC> is just a waste of time. If I'm trying to find a wx regression, then I
GC> need to rebuild old wx versions, and I surely don't want a hook in the
GC> lmi repository to do that.

 Exactly.

GC> More generally, I cannot 'git bisect' from the repository's first (2005)
GC> commit to its last: I'd need to rebuild ancient versions of tools that
GC> might not even be available today, or might require scripts or makefiles
GC> that aren't even compatible with today's shell or 'make'.

 This is certainly true as well, although I think the main danger comes
from changes to C++ compilers rather than to the shell (which is 100%
backwards-compatible, AFAIK) or to make (which is not quite 100%, but
still pretty conservative).

 FWIW some people actually import all their build tools into the VCS,
including the full compiler binaries and maybe even the OS images
(nowadays they could be Docker images) the tools run on. I've never done
this myself because rebuilding old versions has never been that important
to me, but it is definitely doable and it's hard to argue against doing it
in principle -- it's just that, in practice, it has a lot of costs.

GC> I guess the conclusion is that using external libraries introduces
GC> complexities; and bringing such libraries into git submodules does make
GC> those complexities somewhat less difficult to manage, but does not
GC> magically eliminate them.

 No, but it makes it much simpler to use different versions of the same
dependency, including one with project-specific custom changes, and switch
between them. Of course, you could -- and you did -- do it before with
scripts applying patches to the downloaded files, but using submodules
simplifies this a lot.
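
 Just as a schematic illustration (all the names and URLs below are made
up), pointing a submodule at a project-specific fork and moving it to a
different version is only a matter of committing a new gitlink:

    # .gitmodules entry for a patched fork
    [submodule "third_party/somelib"]
        path = third_party/somelib
        url = https://example.com/our-fork/somelib.git

    # switch the dependency to another version
    cd third_party/somelib
    git fetch origin
    git checkout v1.2.3-with-local-patches    # any tag or SHA1
    cd ../..
    git add third_party/somelib
    git commit -m "Update somelib to v1.2.3 + local patches"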

 Also note that I've tried to change the existing use of dependencies, and
the lmi build system in general, as little as possible, so its use of
submodules is a bit idiosyncratic: it doesn't actually use any files from
the submodules, other than in the install_*.sh scripts that build and
install the dependencies from their sources. It's much more typical to use
the headers directly and to build the libraries in the build directory of
the main project, which makes things a bit simpler. But I don't propose to
change this because the existing system seems to work well enough and I
don't think we'd gain much from changing it.
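
 Schematically, the current pattern amounts to "configure, build and
install the submodule sources into a prefix that lmi then uses", i.e.
something like this (all the paths here are placeholders, not the actual
lmi ones):

    # what an install_*.sh-style script does, very roughly
    src=$PWD/third_party/somelib     # submodule checkout
    bld=/tmp/somelib-build           # scratch build directory
    pfx=/opt/local                   # prefix lmi's makefiles point at
    mkdir -p "$bld"
    cd "$bld"
    "$src/configure" --prefix="$pfx"
    make
    make install

whereas the "typical" layout would simply add the submodule's headers to
the include path and compile its sources inside the main project's own
build directory.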

GC> Well, a magical solution is possible--just
GC> redesign the build system so that all libraries are dependencies--but
GC> that's so much costly magic that we wouldn't want to do it.

 I'm not sure why you think this is so costly. And don't we practically do
this already? I'm afraid I might be missing your point here.

 FWIW the current setup is quite close to ideal from my point of view. The
repository is practically self-contained (with the exception of build
tools, OS images, schematics for constructing the semiconductor factories
needed to produce the CPUs all this runs on and other minor details) and
updating the dependencies is as simple and painless as I can imagine it to
be. What problems do you see with it, other than having to run "git
submodule update" after switching branches/commits?
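
 BTW, even that last remaining step can be automated for those who always
want the submodules to follow the superproject, either per switch or
globally:

    # update the submodules as part of a single switch
    git switch --detach --recurse-submodules <some-SHA1>

    # or make checkout/switch do it by default in this clone
    git config submodule.recurse true

although, as discussed above, whether that is desirable depends on whether
you actually want to rebuild the dependency afterwards.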

 I'm probably too set in my own ways of working and could be missing
something obvious, so I'm genuinely curious to understand your point of
view.

 Thanks,
VZ
