
Re: [lmi] autotools versus makefiles


From: Greg Chicares
Subject: Re: [lmi] autotools versus makefiles
Date: Sat, 21 Jan 2006 16:48:12 +0000
User-agent: Mozilla Thunderbird 1.0.2 (Windows/20050317)

This is lengthy, but by considering the details I think we can find a
way to support both paradigms with less work and more consistency.

On 2006-1-13 22:41 UTC, Vadim Zeitlin wrote:
> On Fri, 13 Jan 2006 11:15:46 +0000 Greg Chicares <address@hidden> wrote:
> 
> GC> but we have far fewer such blocks than, say, 'sed' does. Macros
> GC> like LMI_COMPILER_PROVIDES_STRTOLD come from a compiler-specific
> GC> 'config_*.hpp' file that's written by hand.
> GC> 
> GC> Here's a real advantage of autoconf: it could write those macros
> GC> for us. The offsetting disadvantage is that it's one more tool to
> GC> learn and maintain.
> GC> 
> GC> I don't see a compelling case for adopting autotools. What am I
> GC> missing?
> 
>  I don't think you miss anything, you've summarized the main advantage of
> autoconf in an excellent way. It just allows you to write a test for
> whatever function you want to use once instead of writing config.hpp
> manually for each supported compiler. Of course, the gain is especially
> noticeable when you use many compilers: autoconf is clearly overkill if you
> only use one system and one compiler. LMI is in somewhat middle ground: it
> is currently built under Windows and Unix but using one compiler only. So
> the advantage of autoconf right now is debatable. But I do believe that if
> we plan to support more compilers, it can become quite important.

I'd very much like to support more compilers. I recently committed a change
that lets a non-wx version of lmi build with como, though it doesn't run
correctly yet and I don't have time now to look into the run-time errors.

How would autotools make this easier? I find a thread starting here
  http://sources.redhat.com/ml/automake/2001-05/msg00359.html
that seems to suggest that it's difficult to use automake with a compiler
other than gcc. OTOH, this message
  http://sources.redhat.com/ml/cygwin/2000-10/msg00347.html
| You might just want to use automake, autoconf and libtool.
| AFAIK, they are supporting VC++ syntax.
suggests that gnu autotools supports msvc--as a special case, I would
presume, due to its widespread use.

Yet como is very different. Here's a real problem: either of these commands
  make -f como_4_3_3.make unit_tests
  make -f como_4_3_3.make lmi_cli_monolithic.exe
works fine after removing the como object directory, but running them in
succession without removing that directory results in:

  C++ prelinker: error:
    T1 std::fill_n<T1, T2, T3>(T1, T2, const T3 &)
      [with T1=double *, T2=unsigned int, T3=double]
    assigned to input_sequence.o and actuarial_table.o
  C++ prelinker: error:
    bad instantiation request file -- instantiation assigned to more
    than one file

I guess this explains it:

  http://groups.google.com/group/comp.sys.sgi.bugs/msg/ce6cbc0b03378304
| You are probably linking the same .o file into two or more a.out files.
|
| Currently, this is broken in the 5.3 C++ compiler: it assumes a model where
| each .o is linked into exactly one a.out or shared library.  If you link
| the same .o into two a.out's, the prelinker gets all confused, and emits
| that message.

and I guess that's just the way EDG-based compilers work:

http://www.codesourcery.com/archives/pooma-dev/msg00759.html
|> Does gcc have any support for this sort of thing?  With
|> KCC, or any EDG based compiler, files are first compiled without
|> instantiating any templates and a database is constructed which tells the
|> compiler what templates can be instantiated in any given translation unit.
|> Then the prelinker uses this database and it's knowledge of what templates
|> need to be instantiated to assign templates to a translation unit and then
|> that translation unit is recompiled.  Does gcc handle template instantiation
|> differently and if so, how?

From the EDG POV, then, gcc is anomalous. But autotools were written for gcc
(and apparently extended to msvc), and from that POV, EDG is anomalous, and
I don't see much hope that automake has a canned solution for it.

>  A few other things:
> 
> - tests for functions can be written manually,

I believe that only these four C99 functions have proved to be problematic:
  expm1(), log1p(), snprintf(), strtold()
but they're all used only in very isolated cases. We have macros like
  LMI_COMPILER_PROVIDES_EXPM1, LMI_COMPILER_PROVIDES_STRTOLD
and 'config_*.hpp' files to define them. We need another such file for each
new compiler. We can write it by hand, or autoconf can write it. So far, it
sounds like the choice is manual labor versus elegant but heavyweight
automation; but let's rethink this.

The autotools paradigm works something like this:
  memmove(x, y, n) is the same as bcopy(y, x, n), so...
  if you have memmove(), then use it;
  else if you have bcopy(), then use it, swapping arguments;
  else ...give up and report an error.
But we're actually outside that paradigm:
  expm1(x)   is not really the same as exp(x)-1
  snprintf() is not really the same as sprintf()
and a function named snprintf() might be implemented incorrectly: borland
got it wrong, and so did the msvc rtl used by mingw.

I think we should treat expm1() like this (untested):

  #include <cmath>

  // Assumes 'warn()' and a flag like this are declared somewhere suitable:
  //   bool global_untrustworthy_flag = false;

  #if !defined LMI_COMPILER_PROVIDES_EXPM1
  inline double expm1(double x)
      {
      global_untrustworthy_flag = true;
      return std::exp(x) - 1.0;
      }
  #endif // !defined LMI_COMPILER_PROVIDES_EXPM1

  int main()
      {
      if(global_untrustworthy_flag)
          warn("Results may be invalid.");
      // ...
      }

and the other three C99 functions similarly. Thus, the presence of a C99
function is desirable, but exceptional. Any C++98-conforming compiler
will build the system, so autoconf doesn't need to worry about this stuff.
To validate a toolset, you need to validate four C99 functions (that's
the hard part, but we have unit tests for it) and add a 'config_*.hpp'
file (trivial in comparison).

> but configure may also do
>   other compile-time checks, e.g. verify whether the correct version of
>   wxWidgets is available (and built with correct options),

IIRC, building wx with autotools creates 'wx-config', a script that makes
it easy to retrieve the options wx was built with, so that we wouldn't
need to write, e.g.,
  -DWXUSINGDLL
  -D__WXDEBUG__
  -DNO_GCC_PRAGMA
in the lmi makefiles (and manually keep them coordinated with the wx build
options we're using, which may change over time).

But this can be done in the makefiles, too, can't it? I haven't tried it:
I have no 'wx-config', probably because I built wx with its makefiles.
I wouldn't mind using 'wx-config', though, if I've understood it correctly.
Is there a way to get 'wx-config' from wx's 'config.h' without autotools?
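
For the record, here's roughly how I'd expect to consume it in the lmi
makefiles if it were available (an untested sketch; the variable and target
names below are made up):

  # Untested sketch: pull wx's build options from 'wx-config' instead of
  # hard-coding -DWXUSINGDLL et al. 'some_wx_dependent_target' is made up.
  wx_cxx_flags := $(shell wx-config --cxxflags)
  wx_ld_flags  := $(shell wx-config --libs)

  some_wx_dependent_target$(EXEEXT): CXXFLAGS += $(wx_cxx_flags)
  some_wx_dependent_target$(EXEEXT): LDFLAGS  += $(wx_ld_flags)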

> whether we have
>   boost installed (or need to build it as part of lmi)

People who actually use lmi won't have boost installed. Our niche is narrow.
We have to accommodate people who know little about such tools. That's why
we're writing 'setup.make' to create an appropriate environment from scratch.

Most boost libraries are implemented in headers only, and we only need to
compile four boost files, so it's not a large problem. In fact, it's a tidy
solution to a versioning problem. I want to follow this advice:

http://boost.org/more/faq.htm
| backward compatibility with prior version isn't always possible. Deal with
| this by freezing the version of the Boost libraries used by your project.
| Only upgrade at points in your project's life cycle where a bit of change
| will not cause problems.

and anyone who has boost installed probably has a later version than the one
we use. At best, the autotools approach might save compiling four files, but
its overhead would probably more than negate that savings. Still, if we can
find an efficient way to keep lmi autotoolized, I wouldn't object to letting
others have the freedom to use an existing boost installation.

> and so on -- this is
>   very nice IMO and is impossible without using a separate tool and so the
>   only way to do it is by manually specifying the build parameters and, of
>   course, "manually" means more space for mistakes

Aside from wx and mpatrol, we use four third-party libraries.

1) cgicc uses autotools, but cgicc-3.2.3, the current version last time
I checked, fails to build with MSYS. We use cgicc-3.1.4, which also
uses autotools, but it doesn't build out of the box with MSYS--it's
just easier to fix than cgicc-3.2.3.

2) libxml2 supports autotools, and libxml2-2.6.19 appears to build with
MSYS, though we haven't validated that version for use with lmi yet.
I'm very happy to use an autotools-built libxml2 that simply works out
of the box.

3) xmlwrapp has its own custom build system. (You're about to replace
it with libxml++, but we're still using it today.)

4) boost has its own custom build system.

We can't require naive potential users to install custom build systems.
We can compile
  nine cgicc files
  five xmlwrapp files
  four boost files
by treating them as though they were lmi source files. That way, we use the
same compiler, with the same options, for these eighteen files as for lmi's
186 '.c' and '.cpp' files. I believe this makes mistakes less likely, not
more likely.
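
To sketch what I mean (directory variables are made up, and the object list
is abbreviated):

  # Sketch only: put the third-party source directories on the vpath and
  # add their objects to the ordinary object lists, so one compiler and
  # one set of flags covers everything. Variable names are hypothetical.
  vpath %.cpp $(cgicc_source_dir) $(xmlwrapp_source_dir) $(boost_source_dir)

  # Abbreviated; the real list would name all eighteen objects.
  third_party_objects := Cgicc.o CgiEnvironment.o

  lmi_objects += $(third_party_objects)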

> - autoconf makes integration with other tools usually easier because it is
>   a well-known standard; e.g. creating Debian (or cygwin, for that matter)
>   packages is much simpler if you have an autoconf-based build system

OK. Are there any other tools, besides debian's package facility and redhat's,
that work more easily with autoconf?

> - we gain a few very nice features for free when using autotools:
>    . standard configure arguments such as --prefix or --enable-debug:
>      it's really convenient to be able to specify them in a usual way
>      instead of having to modify the makefiles or pass the flags on
>      command line

Is this only the difference between
  configure --enable-debug --prefix='/foo/bar' && make
and
  make CXXFLAGS='-g' prefix='/foo/bar'
? Isn't that just a matter of taste? Still, if we can find a way to support
autotools easily, I don't mind accommodating others' tastes this way.

>    . possibility to build in another directory: this is much more tidy than
>      building in the source directory

The lmi makefiles do that already.
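A minimal illustration that plain make handles out-of-tree builds--not
necessarily how the lmi makefiles actually do it; 'src_dir' is a placeholder:

  # Run make from a separate build directory and let vpath find the sources.
  src_dir := /path/to/lmi/src
  vpath %.cpp $(src_dir)

  %.o: %.cpp
        $(CXX) $(CXXFLAGS) -I $(src_dir) -c $< -o $@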

>    . make targets such as install, dist, ... are implemented automatically
>      (this is done by automake and not autoconf but it doesn't matter)

We have an 'install' target, and it uses 'prefix' and 'exec_prefix'
as prescribed here, AFAIK:
  http://www.gnu.org/software/make/manual/html_chapter/make_14.html#SEC131
We have a 'test' target; maybe I should name it 'check' to conform to
typical GNU practice. We have 'clean', 'distclean', 'mostlyclean', and
'maintainer-clean' just to conform to GNU standards, even though they all
happen to do the same thing for now.
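If renaming the target seems disruptive, a one-line alias would do (sketch
only):

  # Conform to GNU naming without breaking habits: 'check' as a synonym
  # for the existing 'test' target.
  .PHONY: check
  check: test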

We don't have 'dist', but we do have 'archive' to serve the same purpose.
It uses bzip2. I wouldn't mind adding a 'dist' target that would use gzip
instead. That's easy to do, and would make the system seem more familiar
to developers who understand such conventions. But it would exist just for
appearance's sake. It isn't actually useful, because our goal is to keep
cvs releasable at every moment, not just on infrequent occasions when a
release is declared.
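
Just to show how little work 'dist' would be (untested; 'distribution_files'
stands for whatever file list the existing 'archive' target already uses):

  # Untested sketch of a 'dist' target paralleling 'archive', but gzipped.
  # Assumes GNU tar and a 'distribution_files' list defined elsewhere.
  .PHONY: dist
  dist:
        tar --create --gzip --file=lmi.tar.gz $(distribution_files)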

>    . without speaking of automatic dependency tracking

For gcc and perhaps msvc. What about borland? What about como? Anyway, we
already use the autodependency method Tom Tromey invented for automake.
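For concreteness, the gcc version of that idea boils down to something like
this--a simplified sketch, with 'cxx_objects' standing in for our real
object lists:

  # Emit a '.d' makefile fragment as a side effect of each compilation,
  # then include whatever fragments already exist.
  %.o: %.cpp
        $(CXX) $(CXXFLAGS) -MMD -MP -c $< -o $@

  -include $(cxx_objects:.o=.d)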

> (and hopefully in
>      the future automatic precompiled headers support)

For gcc only? What about the EDG method used by como?

>  Speaking of automake, I do believe that Makefile.am is much more simple
> (and hence maintainable) than its equivalent(s) in the manual build system.
> Of course, it probably doesn't do everything it does but still.

It's more familiar to people who have maintained other packages that use
autotools. But it does far less than 'GNUmakefile' and 'workhorse.make'.
Really, it parallels only 'objects.make'.

I took a copy of 'Makefile.am' and removed everything that simply duplicated
'objects.make'. Only thirty lines remained out of the original 757, so the
two files are somewhere between 90 and 100% equivalent. Compare:

'Makefile.am':
  test_zero_SOURCES = \
    zero_test.cpp
  test_zero_CXXFLAGS = $(AM_CXXFLAGS)
  test_zero_LDADD = \
    libtest_commons.la
      zero.hpp

'objects.make':
  zero_test$(EXEEXT): \
    $(common_test_objects) \
    zero_test.o \

Instead of maintaining these two files in parallel, I think our time would
be better spent on finding a way to make them conformable, so that only one
would take any real work to maintain. For example:

  zero_test_objects := \
    $(common_test_objects) \
    zero_test.o \

  zero_test$(EXEEXT): $(zero_test_objects)

  .PHONY: %_test.am
  %_test.am:
        @$(ECHO) $*_SOURCES = '$($*_test_objects)'
        @$(ECHO) $*_CXXFLAGS = '$$(AM_CXXFLAGS)'
        @$(ECHO) $*_LDADD = libtest_commons.la

  $make zero_test.am
  zero_SOURCES = alert.o alert_cli.o fenv_lmi.o getopt.o license.o  zero_test.o
  zero_CXXFLAGS = $(AM_CXXFLAGS)
  zero_LDADD = libtest_commons.la

Does automake really require that ugly_mixture_of_lower_and_UPPER_CASE?
I could put all the prerequisites like
  zero_test_objects := \
    $(common_test_objects) \
    zero_test.o \
in one file, then just echo the values of those variables; but I don't want
to have that ugly naming style forced upon me. And is
  test_zero_CXXFLAGS = $(AM_CXXFLAGS)
really necessary? Can't automake define a set of default flags?

BTW, this line
      zero.hpp
seems out of place. If it doesn't belong there, then that's a problem we
could avoid by writing a makefile target to generate 'Makefile.am'.

Also BTW, as we discussed the other day, $(common_test_objects) is a list
of objects because that lets me work more efficiently; putting them into
a 'libtest_commons.la' library makes my work harder. I believe some tests
won't work with the library approach if we use msw dll[im|ex]port. But
generating 'Makefile.am' from the lmi makefiles would prevent discrepancies
like that.

>  Finally, let's put it the other way: what are the advantages of not using
> autotools? We already have a working (barring last minute breakage since
> our last tests) autotools-based system. It works under Linux, Cygwin and
> MSYS

I believe we'd need to update MSYS with the autotools packages that you've
confirmed to work. Without doing that, here's what I get:

~/lmi/src/lmi[127]$autoconf
configure.ac:28: error: possibly undefined macro: AM_INIT_AUTOMAKE
      If this token and others are legitimate, please use m4_pattern_allow.
      See the Autoconf documentation.
configure.ac:55: error: possibly undefined macro: AM_CONDITIONAL
configure.ac:148: error: possibly undefined macro: AC_PROG_LD
configure.ac:288: error: possibly undefined macro: AC_PROG_LIBTOOL
configure.ac:292: error: possibly undefined macro: AC_DISABLE_STATIC
configure.ac:297: error: possibly undefined macro: AM_OPTIONS_WXCONFIG
configure.ac:311: error: possibly undefined macro: AM_PATH_WXCONFIG

~/lmi/src/lmi[1]$./configure
./configure: line 1319: syntax error near unexpected token `config.h"'
./configure: line 1319: `          ac_config_headers="$ac_config_headers 
config.h"'

Or was I supposed to run 'autogen.sh' first?

~/lmi/src/lmi[2]$./autogen.sh
Setting up build system for lmi:
 - aclocal
aclocal: configure.ac: 297: macro `AM_OPTIONS_WXCONFIG' not found in library
aclocal: configure.ac: 311: macro `AM_PATH_WXCONFIG' not found in library
Automatic build files setup failed!

> and normally should require much less maintenance than the original
> makefiles in the future. So what prevents you from using it?

If I use my own makefiles, then I don't need to learn autotools. Neither
do my coworkers. And the lmi makefiles have capabilities that aren't
supported by autotools. Anyway, without being convinced of a compelling
advantage, I'm not going to abandon makefiles for automake, bjam, cmake,
or anything else. Switching to wx was costly, but it certainly did have
compelling advantages; switching now to qt would not.

It's important to me to build unit tests with como, and even with borland
to the extent possible: both those compilers have helped me find errors
that happened not to occur with gcc. But making autotools work with como
and borland seems like a vast project. Earlier today I saw a message on
another mailing list, wherein one of the libtool maintainers said
| I know of two failures which we have discussed on the libtool lists but
| not fixed yet; that will likely happen only after MSVC support is done
even though a message I quoted above suggested that msvc support began in
2000. So I have to imagine it's not easy to support another compiler,
particularly an EDG compiler that uses a separate prelinker for templates.

Beyond that, the makefiles do various other things. The 'regression_test'
target is very important to us, for instance. The 'check_idempotence'
target guards against a class of problems not contemplated by autotools.
I don't think it's appropriate for us to spend time making autotools
support these things.

The autotools approach is not without its problems; here's one rant:
  http://freshmeat.net/articles/view/889/
Of course, there are people who like autotools, and I'd like to
accommodate them, though I don't think I'll ever agree with them.

In summary, autotools can be very helpful if you
 - use the C language and want portability to pre-1989 dialects
 - use only gcc or compilers that can be made to work in about the same way
 - have good substitutes for nonstandard C rtl functions or headers
 - need portability to many quirky *nix platforms and toolsets
 - are already familiar with autotools
 - aren't comfortable writing your own makefiles
but not as helpful if you
 - use C++ and don't need to support highly-nonconforming compilers
 - use compilers that work very differently than gcc, like EDG
 - can tell anyone with a quirky platform to install gnu make and g++
 - have more experience with make than with autotools

I don't mind continuing to support autotools as long as we can find a
way to make maintenance simple. Consider:

              revision   bytes
  Makefile.am     1.15   17062
  autogen.sh      1.4     2005
  configure.ac    1.17   22959

'autogen.sh': This is tiny and unlikely to require much maintenance.

'configure.ac': Half the nontrivial changes have served to accommodate the
four C99 functions mentioned above or the __argc problem we removed. I can
rewrite the C99 things as described above--saving half the maintenance if
past experience predicts the future well.

'Makefile.am': I believe my proposal to rewrite 'objects.make' reduces this
file to about one-twentieth of its current size, and makes maintenance more
reliable and easier through automation.

What do you think?



