
error on large directories

From: Morgan Burke
Subject: error on large directories
Date: Thu, 15 May 2003 15:21:32 -0700


I've been plagued for some time by a problem reading large directories 
with CVS.  I have a repository that functions as a database, so it contains
close to 7500 files in 4 directories (1 dir, plus 3 subdirs), totalling
over 700MB and growing.  CVS is used to mirror this database, and to 
extract snapshots as of particular dates.

Under Linux (RH7.3, CVS 1.11.1p1 and 1.11.5), executing "cvs update" on 
this repository generates four error messages (one for each 
directory, I presume):  "cvs update: error reading current directory: 
Value too large for defined data type"

My guess is that the buffer into which the directory info is read is not large enough.
"strace cvs update ." shows the following line just before one of these 
error messages:

getdents64(0x7, 0x818b428, 0x1000, 0)   = 4072

which I take to mean that it is passing a 4K buffer for getdents64() to 
fill with directory data.  Unfortunately getdents64 is not called directly
by the CVS code, so it's hard for me to pinpoint where the call occurs.
I don't know if the 4K buffer size is meant to correspond to the normal 
Linux directory size of 4K, but FWIW, this particular directory is much 
larger:

etc$ ls -ld caldb
drwxrwxr-x    6 e949mgr  dev         86016 May 12 14:09 caldb

This repository is hosted on an SGI server, and is mirrored on other
SGI servers that have no problems performing the same update, so the 
bug appears to be Linux-specific.

I searched the source for something that might control the 
directory read buffer size, but nothing obvious jumped out at me.  I also 
looked for the src/options.h file, as recommended by the INSTALL file,
but it doesn't appear to exist.

Any other suggestions?

-- Morgan Burke
