From: Paul Smith
Subject: Re: Help with static linking
Date: Sun, 02 Jun 2013 10:45:06 -0400

I'm removing automake from this thread as I'm getting two copies of
every mail.  Hope no one minds.

On Sun, 2013-06-02 at 03:06 -0400, Mike Frysinger wrote:
> On Sunday 02 June 2013 01:10:36 Kip Warner wrote:
> > On Sat, 2013-06-01 at 23:14 -0400, Mike Frysinger wrote:
> > > be aware that what ever version of glibc & gcc you use to build, the end
> > > user cannot have a version older than that or it'll fail to start
> > 
> > Do you mean in the case of dynamic linking? If so, that's awful. But
> > strange because I've seen many upstream projects release precompiled
> > binaries without that ever really being made into an issue. Or do you
> > mean in the case of static linking?
> 
> I mean dynamic linking.  It's always been this way.

This is because GCC has some of its internal functionality implemented
in support libraries, which are linked dynamically by default.  This is
especially true if your program is written in C++, because the C++
standard library (libstdc++) is provided with the compiler and is very
compiler-specific.  However, even some low-level C functionality is
implemented in a shared library (libgcc).

If the runtime system has an older version of the compiler with older
versions of these libraries, you can run into trouble (again, C++ is the
biggest culprit: the C helper libraries are pretty stable, release to
release, in general).

This is easily solved, though.  Just add -static-libgcc and, if you use
C++, -static-libstdc++ to the link line; then these helper libraries
are linked statically and it doesn't matter what version of GCC is
installed on the target system.
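
For a hand-run link that might look like the following (myprog and the
object files are placeholder names; with automake you would put the
flags in the appropriate *_LDFLAGS variable instead):

    # Link a C++ program with GCC's helper libraries pulled in
    # statically; libc itself is still linked dynamically.
    g++ -o myprog main.o util.o -static-libgcc -static-libstdc++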

These libraries are covered by the GPL with a special runtime library
exception, which allows them to be linked statically in straightforward
situations without the result being GPL-licensed.  See the license text
for details.

> People who do binary releases oftentimes find an old distro that works and
> then upgrade packages as need be.  Then they keep that image around forever.

This is a different issue from the compiler version.  The above solution
lets you use a newer _compiler_.  This problem relates to the
_distribution_ (that is, the version of libc, etc.).

As Mike says, GNU/Linux distros generally guarantee that if you build
against version X of a system it will run without problems on any
version >=X (note here I'm talking mostly about basic system libraries:
the higher up into userspace you go the less reliable such statements
become).  However, there is no guarantee about running on version <X,
and in fact that very often does not work.  This is not really
surprising: you can't guarantee perfect compatibility, forward and
backward, for all time!

However, using the "old image" method is, IMO, not a good solution for
any larger-scale development.  It's slow, difficult to manage, and
generally painful.

My recommendation for this situation is to instead create a "sysroot",
which is basically a directory structure containing the dev files for a
given distribution: header files and libraries (.so and .a).  You don't
need any bin, man, etc. files.  Pick a pretty old distribution (the
oldest that you want to support).  The current-minus-one Debian or Red
Hat distros are good choices, generally, because they usually have such
old versions of everything that it's unusual to find another mainstream
distro with older versions.
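
One possible way to populate such a sysroot on a Debian-style system is
to unpack the relevant dev packages directly into a directory, without
installing them.  The package names and paths here are only examples,
and you would point apt at the old release's repositories first:

    # Hypothetical sketch: fetch dev packages for the old release and
    # unpack them into a sysroot directory, not onto the live system.
    mkdir -p ~/sysroots/oldstable
    apt-get download libc6 libc6-dev linux-libc-dev
    for deb in *.deb; do
        dpkg-deb -x "$deb" ~/sysroots/oldstable
    done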

Alternatively, if you prefer to distribute different packages for
different systems, you can create multiple sysroots: one for each
system.  They're not that big and are easy to manage.

Then use GCC's --sysroot flag to point the compiler at the sysroot
directory structure rather than your system's headers and libraries.
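
With an autoconf-based package, the least intrusive way to do this is
to fold the flag into CC and CXX, something like the following (the
sysroot path matches the sketch above):

    # Point every compile and link at the sysroot instead of the
    # build machine's own headers and libraries.
    export CC="gcc --sysroot=$HOME/sysroots/oldstable"
    export CXX="g++ --sysroot=$HOME/sysroots/oldstable"
    ./configure
    make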

Now your build environment is portable and completely divorced from the
version of the underlying build system, and the result will run on any
distribution that has the same or newer versions of the libraries in
your sysroot.

It's a bit of effort to set up but it's a LOT nicer than dealing with
virtual machines with older releases loaded, or whatever.


Regarding autoconf: using the above setup you have two choices I can
see.  First, you can just go with it as-is and hope the result works
well (it probably will).  Second, you can try to build your packages as
if you were cross-compiling, which is a little safer since autoconf
won't try to use your build system to infer things about your host
system if it detects that they're different.  However, not all packages
have perfect support for cross-compilation, so it may be more work.
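
A sketch of the second approach, assuming the same sysroot as above
(the --host triplet is only an example, and config.guess ships with
autoconf-generated packages, though its location varies):

    # Make build and host differ so configure runs in cross mode and
    # won't execute test programs to probe the host system.
    ./configure --build=$(./config.guess) \
                --host=x86_64-unknown-linux-gnu \
                CC="gcc --sysroot=$HOME/sysroots/oldstable"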



