RE: Long command lines with xargs

From: Bijal Shah
Subject: RE: Long command lines with xargs
Date: Wed, 22 Jan 2003 09:18:46 -0000

I believe the great Greek Homer had it right when he said "Doh!". Forgot about that. That's what comes of trying to think when the brain isn't ready to go any more.
What I would do is write a shell/Python script or a small program to generate my dependencies - it's not a hard one to write, and there are a few examples on the web in the guise of "makedepend" clones. You can embed your directory structure in the program, or give it a root directory on the command line and have it build a list of subdirectories. Once you have that, you can use the approach outlined earlier for the compilation step.
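A minimal sketch of such a scanner in Python (the function names are made up for illustration, and unlike a real makedepend clone it only resolves one level of includes, not headers included by headers):

```python
import os
import re

# Matches both #include <fred.h> and #include "fred.h"
INCLUDE_RE = re.compile(r'^\s*#\s*include\s*[<"]([^>"]+)[>"]')

def find_headers(root):
    """Map each header's basename to the directory that contains it."""
    headers = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith('.h'):
                headers[name] = dirpath
    return headers

def needed_dirs(c_file, headers):
    """Return only the directories holding headers this C file includes."""
    dirs = set()
    with open(c_file) as f:
        for line in f:
            m = INCLUDE_RE.match(line)
            if m and m.group(1) in headers:
                dirs.add(headers[m.group(1)])
    return sorted(dirs)
```

Joining the result with `' '.join('-I' + d for d in dirs)` gives the trimmed-down -I list instead of all 400 directories.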
Another thing to try might be the CPATH environment variable (see the GCC manual). Setting it in the makefile should mean it gets picked up by any gcc invocation from that makefile, though you might have to export it from the makefile. I don't know whether the shell imposes a length limit on environment variables, though.
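A sketch of that idea, assuming a GNU toolchain (GCC treats each colon-separated CPATH entry like a -I directory) and a stand-in src/ tree in place of the real 400 directories:

```shell
# Build a colon-separated list of every subdirectory of src/ and export it;
# with CPATH set, the subsequent gcc invocations need no -I flags at all.
mkdir -p src/a src/b            # stand-in tree for the sketch
CPATH=$(find src -type d | paste -sd: -)
export CPATH
echo "$CPATH"
```

Note that the environment is counted against the same exec() size limit as the argument list on most systems, so this moves the problem rather than removing it entirely.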
Finally, one more tool to check out - I have found it very useful indeed for analysing source code and finding your way around quite large projects. It builds a database of a source tree and lets you browse the whole thing. I use it myself for one specific reason - it has a project-wide search and replace - invaluable when I was refactoring an old codebase to get around problems similar to yours. I used it to change #include <fred.h> to #include <location/fred.h>, which meant I could set my include paths one directory higher up. It is Win32-based but pretty good anyway :)
-----Original Message-----
From: Grandquist, Dean [mailto:address@hidden
Sent: 22 January 2003 00:49
To: address@hidden
Subject: RE: Long command lines with xargs

That is not a bad idea, but our code base is old (7+ years and 7+ products), so I don't think that will "solve" the command-line length limit problem.
A quick test on one new file and one old file suggests the 400 includes would be reduced to around 40. Nice. 8-)
The real problem is that the dependency generation still needs all 400, and we rebuild dependencies when a file changes.
-----Original Message-----
From: Bijal Shah [mailto:address@hidden
Sent: Tuesday, January 21, 2003 11:33 AM
To: address@hidden
Subject: RE: Long command lines with xargs

You could re-engineer the code? :)
Seriously, the existing code base is not exactly good software engineering practice, but we all have to live with situations like this. What can you do? Well, consider tackling the problem from a slightly different perspective - "what do I need in order to compile this file?"
This might sound a little bizarre, but I was wondering whether you could run an analysis on each C file to reduce your command-line length - i.e. use only the -I directives you really need, rather than trying to circumvent shell limitations.
Doing a makedepend-type thing would result in a rule of the form (see the make manual or Paul Smith's GNU make web page):
x.o: x.c /a/a.h /b/b.h /c/c.h
It should be possible to add commands so that you get "-I/a -I/b -I/c" rather than "-Ia -Ib -Ic -Id -Ie ...", which you could write to an intermediate makefile and then kick off to compile the code. It shouldn't be that hard:
- use $(filter) to get the header file list
- use $(dir) to get the directories from the header list
- process the list in shell to remove duplicates
- write into a makefile
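Those steps could be sketched in GNU make itself - this is an untested fragment against the example rule above, and note that $(sort) removes duplicates as a side effect, so the shell round-trip in the third step may not even be needed:

```make
# Inside a recipe, $^ holds every prerequisite (the .c file plus all
# headers). Keep only the headers, take their directories, dedupe with
# $(sort), and prefix each one with -I.
NEEDED_INCLUDES = $(addprefix -I,$(sort $(dir $(filter %.h,$^))))

x.o: x.c /a/a.h /b/b.h /c/c.h
	$(CC) $(NEEDED_INCLUDES) -c $< -o $@
```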
Just an idea - not complete, and the thinking is muddy after a long day - but it might be workable.
-----Original Message-----
From: Grandquist, Dean [mailto:address@hidden
Sent: 21 January 2003 04:02
To: address@hidden
Subject: Long command lines with xargs

I am having a bit of a problem finding the "good" way to pass very long command lines to gcc from make.

Our build system is based on an existing code base that has 400 directories of files. The C code has includes from any of the 400 directories without using any paths in the #include line. Thus all 400 directories need to be passed as -I directives to gcc.

Here is a subset of my rules for generating the gcc line:
# ALL_PROJECT_DIRECTORIES is a list of the directories that contain code for the project. The variable that defines it lives in a makefile that is included in the base makefile.
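For context, a hypothetical reconstruction of that variable and the flags built from it (the real definition is in the included makefile, which isn't shown here; $(shell find) is one common way to collect a few hundred directories):

```make
# Hypothetical reconstruction - collect every project directory once,
# then turn the list into -I flags for the compiler.
ALL_PROJECT_DIRECTORIES := $(shell find . -type d -not -path '*/.*')
MAKEFILE_CFLAGS := $(addprefix -I,$(ALL_PROJECT_DIRECTORIES))
```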


Project/makefilescripts/outfiles/$(TARGET)/%.o: %.c
        $(CC) $(MAKEFILE_CFLAGS) -c $< -o $@

This gives me a command line that is too long for bash. It looks like POSIX only guarantees a command-line length of a few thousand characters; the real limit varies by system. I need command lines around 10,000 chars to use this makefile method.
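The actual limit is the system's ARG_MAX (the combined size of argv plus the environment passed to exec), and it can be queried on the build host with a standard POSIX utility, which should also work under Cygwin:

```shell
# Print the maximum combined length of argv + environment for exec()
getconf ARG_MAX
```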

$ bash --version
GNU bash, version 2.05b.0(8)-release (i686-pc-cygwin)
Copyright (C) 2002 Free Software Foundation, Inc.

If I use xargs and generate a response file, my makefile runs very slowly. I am stuck in a catch-22: I have a variable in make that is longer than the command-line limit, and I need to write it to a file.

The way that works (very slowly) is the following:

# $(shell) forks once per flag, which keeps each echo under the limit
# but is also what makes this so slow
        @echo $@
        $(foreach ar,$(MAKEFILE_CFLAGS), $(shell echo " "$(ar)>>$@))

# change the compile line to this
Project/makefilescripts/outfiles/$(TARGET)/%.o: %.c
        xargs $(CC) -c $< -o $@ < Project/makefilescripts/outfiles/$(TARGET)/c_compile_args

This same problem exists for my link line, except that line is much longer - about 30,000 chars. I could fix the link by using a collection of libs, one per subdirectory, but that does not fix my compilation problem.

If I could change the command line limit in bash that would solve my problem too, but that is a question for a different list 8-)

Any ideas about how I can write a variable to a file without going to the shell a few hundred times? A build of a single changed file went from 5 seconds to 30 seconds because of my xargs/$(foreach) loop.
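One way out of that catch-22, assuming a GNU make new enough to have the $(file) function (4.0 and later): $(file) writes text to disk from inside make itself, with no shell involved and therefore no argv limit at all. A sketch against the rule above:

```make
Project/makefilescripts/outfiles/$(TARGET)/%.o: %.c
	$(file >$@.args,$(MAKEFILE_CFLAGS))
	xargs $(CC) -c $< -o $@ < $@.args
```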

--Dean Grandquist
Software Engineer
Electronic Arts
