
Re: Versioning large files


From: Paul Gelderblom (ptok)
Subject: Re: Versioning large files
Date: Mon, 16 Feb 2004 20:41:31 +0100

> William Mallouk writes:
> > We need to version large files (400MB+) such as cd-rom images and
> > large binaries. We realized that CVS gets very slow when versioning
> > these files. Did anybody come across the same problem? Any suggestions?

Larry Jones writes:
> Don't use CVS.

Yep, right, but some of us like CVS because it is an industry-strength
versioning system and very stable. You do not want to run a separate
versioning system just for a few binary files. On web sites, for example,
source files and image files are often maintained in the same directory
structure.

It would be nice if CVS had an additional (parallel) system for versioning
binary files in a less resource-intensive manner, but it will probably never
come to that. (I would say: use a hacked variant of the RCS file format that
contains only the versioning information, tags, and so on, plus links to what
are essentially plain copies of each binary file version in the same
directory. Add a checksum to maintain integrity. It may not be subtle, but it
will certainly work, and disk space is cheap.)
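
To make the idea concrete, here is a rough sketch in Python (all names and
the on-disk layout are made up by me and have nothing to do with real RCS
internals): every commit stores a plain full copy of the file, and a small
metadata file carries the revision numbers, tags, and checksums.

    import hashlib
    import json
    import shutil
    from pathlib import Path

    STORE = Path("binary-store")  # side-car directory next to the RCS files

    def _sha1(path):
        # Checksum used to detect corrupted copies at checkout time.
        h = hashlib.sha1()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def commit(path, tag=None):
        # Store a full copy of `path` as the next revision; no deltas at all.
        STORE.mkdir(exist_ok=True)
        meta_file = STORE / (path.name + ",meta.json")
        meta = (json.loads(meta_file.read_text())
                if meta_file.exists() else {"revisions": []})
        rev = len(meta["revisions"]) + 1
        copy = STORE / f"{path.name},{rev}"
        shutil.copy2(path, copy)
        meta["revisions"].append({"rev": rev, "sha1": _sha1(copy), "tag": tag})
        meta_file.write_text(json.dumps(meta, indent=2))
        return rev

    def checkout(name, rev, dest):
        # Copy a stored revision out, verifying its checksum first.
        meta = json.loads((STORE / (name + ",meta.json")).read_text())
        entry = meta["revisions"][rev - 1]
        copy = STORE / f"{name},{rev}"
        if _sha1(copy) != entry["sha1"]:
            raise IOError(f"{copy} failed integrity check")
        shutil.copy2(copy, dest)

    # e.g. commit(Path("image.iso"), tag="REL_1")
    #      checkout("image.iso", 1, Path("image.iso"))

Checkout is then a single copy plus one checksum pass, no matter how many
revisions exist: that is the whole point.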

Anyhow, William, you asked for suggestions, so here is mine: if you do use
CVS and do include the large binary files, make sure you size your server
properly. Checking out a large binary file with many versions is a task that
consumes significant memory on the server. It may be so slow because the
server is swapping like hell. My experience is that increasing core memory
(and swap file size) on the server can help a lot. Throw a GB of memory at
it.
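
A back-of-envelope calculation (my own rough assumption, not measured CVS
behavior: binary deltas compress poorly, so I treat every stored delta as
roughly the full file size) shows why the working set blows up:

    # Rough sizing only; assumes each stored binary delta is about as large
    # as the file itself, since diff finds little in common between
    # revisions of a binary.
    FILE_SIZE_MB = 400  # the CD-ROM image from the original question
    REVISIONS = 10      # revisions the server may walk through per checkout

    working_set_mb = FILE_SIZE_MB * REVISIONS
    print(f"worst-case working set: ~{working_set_mb} MB "
          f"(~{working_set_mb / 1024:.1f} GB of RAM plus swap)")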

Paul Gelderblom
