
From: Bob Proulx
Subject: Re: [FAQ] ARG_MAX and POSIX
Date: Mon, 3 Sep 2007 16:20:26 -0600
User-agent: Mutt/1.5.9i

Sven Mascheck wrote:
> The FAQ, item 19. "Argument list too long", reads about ARG_MAX:
>    "POSIX only requires 20k which was the traditional value used
>     for probably 20 years."
> In fact, this minimum is 4096. (It's accessible in the shell with
> "getconf _POSIX_ARG_MAX" on such a system.  This value determines
> only the minimum, POSIX also knows ARG_MAX for the actual value.)
> http://www.opengroup.org/onlinepubs/009695399/basedefs/limits.h.html

Thank you very much for that correction.  I have updated that FAQ
entry.  This prompted me to rework it substantially.  Here is a URL
and I will include the updated entry at the end of this message too.


If you find other improvements please send them along.

> <http://www.in-ulm.de/~mascheck/various/argmax/>

And I included a link to your reference page.  Good stuff there.


Here is the current reworked entry.

@c ---------------------------------------------------------------------------
@node Argument list too long
@chapter Argument list too long

I tried to move about 5000 files with @command{mv}, but it said:

@example
  $ mv * /some/other/directory/
  bash: /bin/mv: Argument list too long
@end example

The traditional Unix operating system kernel reserves a fixed amount of
memory for a program's environment and argument list combined.  All of
the environment variables and all of the command line arguments must
fit into this space; if they do not, execution of the external program
fails.  In the above example the @code{*} is expanded by the shell into
a literal list of filenames, which is passed as command line arguments
to the @command{mv} program.  If the expanded list of files in the
current directory, together with the environment, is larger than this
fixed buffer, then the arguments will not fit and it is impossible to
execute the program with the specified list of arguments and the
current environment size.

Note that the ``Argument list too long'' message comes from the
@command{bash} shell command line interpreter.  Its job is to expand
command line wildcard characters that match filenames, and it expands
them before any program can see them.  This behavior is common to all
shells on Unix-like operating systems.  The expanded list cannot exceed
the OS limit of ARG_MAX; when it does, the error ``Argument list too
long'' is returned to the shell and the shell reports it to you.
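To see that it is the shell, not the invoked program, that performs the
expansion, compare a quoted and an unquoted wildcard.  (A small
illustration added here for clarity; the scratch directory and file
names are hypothetical.)

```shell
#!/bin/sh
tmp=`mktemp -d`          # scratch directory for the demonstration
touch "$tmp/a" "$tmp/b"
cd "$tmp"
echo *       # the shell expands the wildcard: prints "a b"
echo '*'     # quoting suppresses expansion: prints "*"
```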

This is not a bug in @command{mv} or other utilities, nor is it a bug
in @command{bash} or any other shell.  It is an architecture limitation
of Unix-like operating systems.  The @command{mv} program was prevented
by the OS from running; the shell is merely the one in the middle
reporting the problem.  The shell tried to execute the program but the
OS refused because the arguments would not fit.  However, this problem
is easily worked around using the supplied utilities.  Please review
the documentation for @command{find} and @command{xargs} for one
combination of programs that works well.

The value can vary from one operating system to another.  POSIX only
requires 4096 bytes as the minimum amount.  Newer operating system
releases usually increase this to a somewhat larger value.  On my
Linux system (2.2.12) that amount is 128k bytes.  On my HP-UX system
(11.0) that amount is 2M bytes; HP-UX provided 20k for many years
before increasing it to 2M.  Sven Mascheck wrote to the mailing list
that early systems such as Unix V7, System III, and 3BSD used 5120
bytes, and that early 4BSD and System V used 10240 bytes.  More
information about ARG_MAX from Sven Mascheck is available on his page
@uref{http://www.in-ulm.de/~mascheck/various/argmax/}.

The @command{getconf} command may be used to print the currently
implemented limit.

@example
  getconf ARG_MAX
@end example
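Because the environment is charged against the same space, you can get
a rough feel for how much room is left for arguments.  This is only a
sketch; exact kernel accounting differs (for example, per-string
pointer overhead is not counted here).

```shell
#!/bin/sh
# Rough headroom estimate: ARG_MAX minus the bytes currently used by
# the environment.  Kernel accounting varies, so treat this as an
# approximation, not an exact figure.
arg_max=`getconf ARG_MAX`
env_bytes=`env | wc -c`
echo "ARG_MAX:           $arg_max"
echo "environment bytes: $env_bytes"
echo "approx. headroom:  `expr $arg_max - $env_bytes`"
```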

It might occur to you to increase the value of ARG_MAX, for example by
recompiling the kernel with a larger value, but I advise against it.
Any limit, even a large one, is still a limit.  As long as it exists
it should be worked around for robust script operation.  On the
command line most of us ignore it until we exceed it, at which point
we fall back to more robust methods.

Here is an example using @command{chmod} where exceeding the ARG_MAX
argument length limit is avoided by using @command{find} and
@command{xargs}.

@example
  find htdocs -name '*.html' -print0 | xargs -0 chmod a+r
@end example
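Another workaround, when all of the files match a single pattern, is
to loop in the shell.  The wildcard expansion happens inside the shell
itself and each @command{chmod} invocation receives only one argument,
so ARG_MAX is never approached.  (This sketch, added for illustration,
assumes the files sit directly under @file{htdocs}; it trades speed
for one process per file.)

```shell
#!/bin/sh
# Loop over the expansion inside the shell; no single exec() ever
# receives the whole file list, so ARG_MAX cannot be exceeded.
# Slower than xargs: one chmod process per file.
for f in htdocs/*.html
do
    chmod a+r "$f"
done
```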

Read the previous question for another facet of this problem.
