Re: gnulib broken on systems lacking fchdir
From: Jim Meyering
Subject: Re: gnulib broken on systems lacking fchdir
Date: Sat, 02 Dec 2006 13:26:39 +0100
address@hidden (Karl Berry) wrote:
> since no one cares if gzip can't recursively compress or decompress a
> hierarchy that's really deep or that contains very long names.
>
> Really?
Hi Karl!
I figure if someone cared, it would have been fixed in the last 10 years ;-)
> Well, I guess the deepest things gzip would operate on is distributions
> of some sort. That probably doesn't compare to the monstrous stuff you
> make coreutils handle.
Exactly.
Have you seen a distribution containing a hierarchy deeper than, say, 10 or
20 levels? Or one containing a relative name even approaching PATH_MAX?
It's not like du or rm, where failing on a very deep hierarchy matters
more. I've run benchmarks of those two on hierarchies 1 million levels
deep. BTW, both complete in under 10 seconds on a Linux tmpfs file system.
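For anyone who wants to reproduce that kind of measurement, here is a minimal
sketch (the depth parameter, temp-directory layout, and single-letter directory
name are my assumptions, not the exact benchmark setup described above) that
builds a deep directory chain and times du and rm on it:

```shell
#!/bin/sh
# Sketch: build a DEPTH-level directory chain, then time du and rm on it.
# Run with the hierarchy on a tmpfs mount to match the timings quoted above.
DEPTH=${1:-1000}           # 1000000 reproduces the million-level case
base=$(mktemp -d) || exit 1

# Create the chain by repeatedly mkdir-ing and descending in a subshell,
# so each mkdir/cd works on a short relative name regardless of total depth.
(
  cd "$base" || exit 1
  i=0
  while [ "$i" -lt "$DEPTH" ]; do
    mkdir d && cd d || exit 1
    i=$((i + 1))
  done
)

time du -s "$base" >/dev/null   # traverse the whole chain
time rm -rf "$base"             # remove it
```

Note that the total path length far exceeds PATH_MAX well before a million
levels, which is exactly why du and rm must traverse with short relative
names (fchdir/openat-style) rather than building one long absolute path.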
> At least, I personally have never had gzip -r fail :). (But then, I've
> never had cp -r fail [in the "old" implementation], either.)
>
> Here, simple is better, since gzip has to be so portable.
>
> More portable than coreutils?
I don't *know*. Just assumed that gzip's amiga, atari, primos, etc.
directories contain useful portability hooks. AFAIK, no one is building
coreutils on such systems.
Besides, from a system/kernel-interface standpoint, gzip has far fewer
requirements than coreutils, so it can easily afford to be more portable.