Re: Medium sized binaries, lots of commits and performance


From: Paul Sander
Subject: Re: Medium sized binaries, lots of commits and performance
Date: Wed, 9 Feb 2005 11:12:01 -0800

Larry gave a great description of why you're seeing your performance degrade over time. As you can see, the more revisions that sit between the head and the version you want, the longer it takes to construct that version.
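
For example (hypothetical file name), the log header shows how many revisions CVS has to walk and where each branch forks off the trunk:

        cvs log -h foo.fmb

If, say, the head is 1.70 and a branch is rooted at revision 1.42, a branch checkout first has to undo 28 trunk deltas to reach the branch point before it can start applying the branch's own forward deltas.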

I can think of two effective and usable ways to combat the problem, plus one marginal one. All of them essentially move the version you want closer to the head.

The first method is your MIGRATE method, which is a time-honored technique with CVS.

The second, which I believe was mentioned, is to reduce the number of revisions by obsoleting those that are no longer needed. This is the marginal technique because history is lost, and the nature of the differences between binary revisions may not buy you much anyway.
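
A minimal sketch of this second method, with hypothetical file and revision numbers. Note that cvs admin -o destroys the named revisions permanently, so back up the repository first; it will also refuse to remove any revision that a branch is rooted at.

        cvs admin -o1.2:1.40 foo.fmb    # "outdate" trunk revisions 1.2 through 1.40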

The third method is to spawn new branches off the head and merge the latest versions of your existing branches onto the new branches, then convert your process to use the new branches instead. This must be repeated periodically to keep a cap on response time.
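
A rough sketch of the third method, with hypothetical file and branch names. Since the files are binary, the old branch's tip is simply copied onto the new branch rather than merged with -j:

        cvs update -A foo.fmb                      # back to the trunk head
        cvs tag -b rel2_2005 foo.fmb               # new branch rooted at the head
        cvs update -r rel2_2005 foo.fmb            # switch the working copy to it
        cvs update -p -r rel2 foo.fmb > foo.fmb    # overwrite with the old branch's tip
        cvs commit -m "carry rel2 tip onto rel2_2005" foo.fmb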

On Feb 9, 2005, at 7:37 AM, address@hidden wrote:

Hi folks,

I've searched the net and mail archives for some help or workaround to
my problem, but most binary issues tend to deal with the impossibility
of diff/merge or whether very large files can be stuffed into CVS.

My colleagues and I work with Oracle Forms, which means binary source
code. At first I was very suspicious of moving to CVS because of that,
but as it turns out we have all become extremely happy with it. We
can't merge, granted, but with our external diff application we reap
enormous benefits from using CVS. Even branching is manageable.

But here's the problem, especially with our largest 3.5 MB file that's
been committed approx. 70 times. When doing a

        cvs update -r HEAD <filename>

things work real fast (5 seconds). But if we do a

        cvs update -r <branch version> <filename>

performance drops from 5 seconds to a minute and a half. I can imagine
something ugly happening with the "filename,v" file on the CVS server,
which is 200 MB in size.

The performance isn't killing us right now, but in maybe 6 months to a
year, who knows how bad it may have gotten?

So the question is whether there are any administrative tools one can
use to compress/rationalize/index the file so that branch access becomes
faster. Is there a way to permanently erase "stuff older than 6 months"?

And if not, opinions on my ideas below would be great. My ideas so
far:

MOVE variant: I wouldn't _like_ to lose the history of the application,
but it might be acceptable if performance degrades too much. I figure I
could move the filename,v file within the repository (to a "backup"
folder), then delete the file on the client side, add a fresh copy, and
recreate the 1-2 active branches - but can any history be kept if you do
this? Will the old history be in the "backup" folder?
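
I assume a ,v file parked in a backup folder could still be read with the standalone RCS tools, if they are installed on the server; something like (hypothetical paths and revision):

        rlog /cvsroot/backup/foo.fmb,v                  # list the old history
        co -p1.42 /cvsroot/backup/foo.fmb,v > old.fmb   # extract one old revision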

MIGRATE: An alternative would be to create a new folder (while keeping
the old one) and simply migrate _all_ 85 files to the new folder (grab
HEAD, add everything in HEAD to the new folder, grab the endpoints of
the branches, and add the branches as best I can).
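
Roughly, with hypothetical directory and branch names, I imagine it would look something like this:

        # in a working copy of the repository's top level
        mkdir newforms
        cvs add newforms                      # new directory alongside the old one
        cp forms/*.fmb newforms/
        cd newforms
        cvs add -kb *.fmb
        cvs commit -m "migrate trunk heads from forms/"
        # then, for each of the 1-2 active branches:
        cvs tag -b rel2 *.fmb                 # recreate the branch in newforms
        cvs update -r rel2
        (cd ../forms && cvs update -r rel2)   # branch tips in the old directory
        cp ../forms/*.fmb .
        cvs commit -m "migrate rel2 branch tips from forms/"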

Regards,

Jesper Vad Kristensen
Aarhus, Denmark



--
Paul Sander       | "To do two things at once is to do neither"
address@hidden | Publilius Syrus, Roman philosopher, 100 B.C.




