
RE: Long command lines with xargs

From: Grandquist, Dean
Subject: RE: Long command lines with xargs
Date: Tue, 21 Jan 2003 16:49:19 -0800

That is not a bad idea, but our code base is old (7+ years and 7+ products), so I don't think that will "solve" the command-line length limit problem.
A quick test on one new file and one old file suggests the 400 includes would be reduced to around 40. Nice. 8-)
The real problem is that the depend step still needs all 400, and we rebuild dependencies when a file changes.
-----Original Message-----
From: Bijal Shah [mailto:address@hidden
Sent: Tuesday, January 21, 2003 11:33 AM
To: address@hidden
Subject: RE: Long command lines with xargs

You could re-engineer the code? :)
Seriously, the existing code base is not exactly good software engineering practice, but we all have to live with situations like it. What can you do? Well, consider tackling the problem from a slightly different perspective: "what do I need in order to compile this file?"
This might sound a little bizarre, but I was wondering if it is possible to run an analysis on each C file so that you can reduce your command-line length - i.e. only pass the -I directives that you really need rather than try to circumvent shell limitations.
Doing a makedepend-type thing would result in a rule of the form (see the make manual or Paul Smith's GNU make web page):
x.o: x.c /a/a.h /b/b.h /c/c.h
It should be possible to add commands so that you get just "-I/a -I/b -I/c" rather than the full "-I/a -I/b -I/c -I/d -I/e ..." list, which you could write to an intermediate makefile and then kick off to compile the code. It shouldn't be that hard:
- use $(filter) to get the header file list
- use $(dir) to get the directories from the header list
- process the list in shell to remove duplicates
- write into a makefile
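The shell half of those steps might look something like this sketch (the header paths are hypothetical stand-ins for what $(filter %.h,$^) would produce):

```shell
# Reduce a list of header files to a de-duplicated set of -I directives.
hdrs="/a/a.h /b/b.h /a/a2.h /c/c.h"                   # stand-in for $(filter %.h,$^)
dirs=$(for h in $hdrs; do dirname "$h"; done | sort -u)  # unique directories
flags=$(for d in $dirs; do printf ' -I%s' "$d"; done)    # prepend -I to each
printf '%s\n' "$flags"
```

(In GNU make itself, $(sort $(dir $(filter %.h,$^))) would de-duplicate without a shell pass at all, since $(sort) removes duplicates.)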
Just an idea, not complete and the thinking is muddy after a long day but might be workable.
-----Original Message-----
From: Grandquist, Dean [mailto:address@hidden
Sent: 21 January 2003 04:02
To: address@hidden
Subject: Long command lines with xargs

I am having a bit of a problem finding out the "good" way to pass very long command lines to gcc with make.

Our build system is based on an existing code base that has 400 directories of files. The C code has includes from any of the 400 directories without using any paths in the #include line. Thus all 400 directories need to be passed as -I directives to gcc.

Here is a sub set of my rules for generating the gcc line:
# ALL_PROJECT_DIRECTORIES is a list of the directories that contain code for the project. The variable that defines it is set in a makefile that is included in the base makefile.


Project/makefilescripts/outfiles/$(TARGET)/%.o: %.c
        $(CC) $(MAKEFILE_CFLAGS) -c $< -o $@
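For context, the elided variable definition presumably turns that directory list into -I flags along these lines (a hypothetical reconstruction, not the actual makefile):

```make
# Hypothetical: how MAKEFILE_CFLAGS might be derived from the 400-directory list.
MAKEFILE_CFLAGS := $(CFLAGS) $(addprefix -I,$(ALL_PROJECT_DIRECTORIES))
```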

This gives me a command line that is too long for bash. It looks like POSIX only guarantees a command-line length of 2048 (the minimum LINE_MAX), but I need command lines of around 10,000 characters to use this makefile method.
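For what it's worth, the kernel-side limit (as opposed to bash's own) can be queried with getconf; POSIX only guarantees a small minimum (ARG_MAX of at least 4096), and the actual value varies widely by system:

```shell
# Print this system's limit on the total size of exec() arguments.
getconf ARG_MAX
```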

$ bash --version
GNU bash, version 2.05b.0(8)-release (i686-pc-cygwin)
Copyright (C) 2002 Free Software Foundation, Inc.

If I use xargs and generate a response file, then my makefile runs very slowly. I get stuck in a catch-22 where I have a variable in make that is longer than the command-line limit, and I need to write it to a file.

The way that works (very slowly) is the following rule, which writes the response file one word at a time:

Project/makefilescripts/outfiles/$(TARGET)/c_compile_args:
        @echo $@
        $(foreach ar,$(MAKEFILE_CFLAGS),$(shell echo " "$(ar)>>$@))

# change the compile line to this
Project/makefilescripts/outfiles/$(TARGET)/%.o: %.c
        xargs $(CC) -c $< -o $@ < Project/makefilescripts/outfiles/$(TARGET)/c_compile_args
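To illustrate the response-file mechanics with a toy example (hypothetical flags, echo standing in for $(CC)):

```shell
# xargs reads whitespace-separated arguments from stdin and appends them
# to the given command, splitting into multiple invocations only when the
# system's argument-length limit would otherwise be exceeded.
printf '%s\n' -I/a -I/b -I/c > /tmp/c_args_demo
xargs echo compile -c foo.c < /tmp/c_args_demo
```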

The same problem exists for my link line, except that line is much longer - about 30,000 characters. I could fix the link by using a collection of libs, one per subdirectory, but that does not fix my compile problem.

If I could change the command line limit in bash that would solve my problem too, but that is a question for a different list 8-)

Any ideas about how I can write a variable to a file without going to the shell a few hundred times? A build of a single changed file went from 5 seconds to 30 seconds because of the $(foreach)/$(shell) loop above.
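One avenue worth trying (a sketch, not a tested fix): GNU make's `export MAKEFILE_CFLAGS` puts the variable into the recipe's environment, so a single-line recipe like `@echo "$$MAKEFILE_CFLAGS" > $@` writes the whole thing in one shell invocation instead of one per word. The shell side is just a single redirect, however long the value, as this simulation with 400 hypothetical directory names shows:

```shell
# Build a 400-directory flag list (directory names are made up) and write
# it with ONE process instead of one $(shell echo ...) call per flag.
LONG_FLAGS=$(i=0; while [ "$i" -lt 400 ]; do printf ' -Idir%d' "$i"; i=$((i+1)); done)
printf '%s\n' "$LONG_FLAGS" > /tmp/c_compile_args_demo
wc -c < /tmp/c_compile_args_demo     # well over the 2048-character limit
```

Whether a 30,000-character environment variable survives on Cygwin is its own question, but it sidesteps the per-word shell cost.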

--Dean Grandquist
Software Engineer
Electronic Arts
