Re: * limitation?
From: Alfred M. Szmidt
Subject: Re: * limitation?
Date: 11 Jul 2002 23:52:47 +0200
User-agent: Gnus/5.09 (Gnus v5.9.0) Emacs/21.3.50
"Tony A. Lambley" <address@hidden> writes:
> I find it interesting that the `ls' works for an obscene number of
> files, yet `ls *' doesn't. bash fails nicely as above, but /bin/sh
> either returns "Segmentation fault" or kills my connection!
This is actually a bug in your system: it cannot handle an argument
list longer than ARG_MAX (or some such limit). `ls *' is expanded by
the shell into an insanely long argument list (bigger than ARG_MAX),
which causes exec() to fail and, in turn, makes bash print an error.
Plain `ls' works because no arguments are passed at all; the program
reads the directory itself. On GNU this limit doesn't exist, and the
argument length is limited only by available memory.
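On systems that do enforce ARG_MAX, the usual workarounds avoid
building one giant argv in the first place. A quick sketch, using
standard POSIX/GNU tools (run it in whatever directory holds the
obscene number of files):

```shell
# See how many bytes the kernel allows for argv + environment:
getconf ARG_MAX

# Let ls read the directory itself instead of passing every name:
ls

# Or batch the names with xargs, which splits them across several
# exec() calls, each one safely under the limit.  printf is a bash
# builtin, so the expansion of * here never goes through exec():
printf '%s\0' * | xargs -0 ls -ld
```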
I have no idea about /bin/sh (is it a link to bash? Is it the Bourne
shell?), so I can't comment on that.
> Any ideas (other than "don't get involved with legacy systems that store
> lots of files in a single dir")?
There is no such "policy" for GNU projects. See above.
To quote (standards)Semantics:
Avoid arbitrary limits on the length or number of _any_ data
structure, including file names, lines, files, and symbols, by
allocating all data structures dynamically. In most Unix utilities,
"long lines are silently truncated". This is not acceptable in a GNU
utility.
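The quoted rule is easy to check empirically; a small sketch,
assuming GNU coreutils and GNU grep:

```shell
# Build a single line of one million characters; a utility with a
# fixed line buffer would truncate it, a GNU utility must not.
yes x | head -n 1000000 | tr -d '\n' > longline.txt

wc -c < longline.txt      # 1000000 -- every byte survived
grep -c x longline.txt    # 1 -- still a single line, not silently split
```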
Cheers,
--
Alfred M. Szmidt