bug-fileutils

Re: rm command fails


From: Alfred M. Szmidt
Subject: Re: rm command fails
Date: Fri, 27 Dec 2002 19:26:31 +0100

   It's actually a shell limit rather than an rm limit, as I understand it.

Then you understand it incorrectly. From the Coreutils FAQ:

Argument list too long

I tried to move about 5000 files with mv, but it said:

  bash: /bin/mv: Argument list too long

The UNIX operating system traditionally has a fixed limit on the
amount of memory that can be used for a program's environment and
argument list combined. You can use getconf to query that limit. On
my Linux system (2.2.12) that amount is 128k. On my HP-UX system
(11.0) that amount is 2M. It varies per operating system. POSIX only
requires 20k, which was the traditional value used for probably 20
years. Newer operating system releases usually increase that
somewhat.

  getconf ARG_MAX
  131072
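
If you want a rough idea of how close a particular expansion comes to
that limit, here is a small sketch. It assumes a bash-like shell where
'echo' is a builtin, so the expanded list never passes through exec and
cannot itself trigger the error; the count is only approximate, since
the real accounting also includes the environment and per-argument
overhead.

  # count the bytes in the expanded list and compare with the limit;
  # 'echo' is a builtin here, so this expansion never goes through exec
  echo *.html | wc -c
  getconf ARG_MAX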

Note that your message came from "bash", your shell command line
interpreter. Its job is to expand command line wildcard characters
into the filenames they match. It expands them before the program
being run can see them. This behavior is therefore common to all
programs on most UNIX-like operating systems. The expanded list cannot
exceed the OS limit of ARG_MAX, and if the shell tries to exceed it
the error "Argument list too long" is returned to the shell, which
passes it on to you.
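
As a rough demonstration of that failure (a throwaway sketch using a
hypothetical scratch directory and GNU seq), you can create enough
files that the expansion of '*' exceeds the limit, then watch an
external command fail while a shell builtin still works:

  mkdir /tmp/argmax-demo && cd /tmp/argmax-demo
  # create the files in batches so the loop itself stays under ARG_MAX
  for batch in $(seq 1 100); do
      touch $(seq -f "quite_a_long_file_name_${batch}_%g" 1 2000)
  done
  /bin/ls *          # external command: "Argument list too long"
  echo * | wc -c     # builtin: no exec, so it still works
  cd / && rm -rf /tmp/argmax-demo   # one argument to rm, so this is safe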

This is not a bug in 'mv' or other utilities, nor is it a bug in
'bash' or any other shell. It is an architectural limitation of
UNIX-like operating systems. The 'mv' program was prevented by the OS
from running, and the shell is just the one in the middle reporting
the problem: it tried to exec the program, but the OS rejected the
call because the argument list would not fit within ARG_MAX. However,
this problem is easily worked around using the supplied utilities.
Please review the documentation on 'find' and 'xargs' for one possible
combination of programs that works well.
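
For the original question about moving the files, one sketch of such a
find/xargs combination looks like this; the directory names are made
up for illustration, and it assumes the GNU versions of find, xargs,
and mv (the --target-directory option names the destination up front
so that xargs can append the filenames):

  # move the files in batches sized by xargs, so that no single
  # mv invocation exceeds ARG_MAX
  find srcdir -maxdepth 1 -type f -print0 \
    | xargs -0 mv --target-directory=destdir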

You might think about increasing the value of ARG_MAX, but I advise
against it. Any limit, even if large, is still a limit. As long as it
exists, it should be worked around for robust script operation. On the
command line most of us ignore it unless we exceed it, at which point
we fall back to more robust methods.
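
One such fall-back, sketched here with the same htdocs layout as the
example below, is a plain shell loop: the wildcard is expanded inside
the shell and each file gets its own chmod, so exec never sees the
whole list (at the cost of one process per file):

  # immune to ARG_MAX, but slower: one chmod process per file
  for f in htdocs/*.html; do
      chmod a+r "$f"
  done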

Here is an example using chmod that avoids exceeding the ARG_MAX
argument length.

  find htdocs -name '*.html' -print0 | xargs -0 chmod a+r
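
The -print0 and -0 options in that example keep filenames containing
spaces or newlines intact. Where the local find supports the POSIX
'-exec ... {} +' form, an equivalent sketch needs no xargs at all,
since find itself batches the arguments to stay under ARG_MAX:

  find htdocs -name '*.html' -exec chmod a+r {} +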

Read the previous question for another facet of this problem.


