
Large File Revision Control??


From: BCC
Subject: Large File Revision Control??
Date: Thu, 23 Aug 2001 11:20:28 -0700

Hi,

We are currently using CVS for source code version control and would like
to expand its use a bit, but we are running into problems...

What we have are many MySQL tables that must be synchronized across a
network (this is on Linux, by the way).  That is, there is one development
machine, and whenever whoever is working on the DB is ready, we update two
other machines with the same databases.  Replication was our first
attempt, but since the master was also being used for development,
replication kept crashing.

Next I thought it would be great to use CVS: just add the database files
to the repository, and once they are clean on the 'master', simply update
them on the other machines.

But no!  I couldn't do it; our files are too large.

Several of our database tables are ~800 MB.  Since the machine I am using
to add them to the repository has only 512 MB of RAM, the add fails.  I
get various error messages, from out-of-memory errors (when I try to add
many files at once) to "no space left on device" when adding a single
249 MB file (I think this happens because swap gets used up?).

How can I do this?  The machines I will be checking the repository out to
each have 2 GB of RAM, so once these big files are in the repository it
should be okay (?).  I can have the 512 MB machine upgraded to 2 GB as
well.

Are there better ways to do this on Linux, short of tar, gzip, and scp?
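For comparison, the manual route I'd rather avoid looks like this (a
sketch; paths are made up, and the "remote" machine is simulated with a
local directory so it runs without a network, where a real transfer would
use scp instead of cp):

```shell
#!/bin/sh
set -e
# Placeholder layout: a fake database directory and a fake remote host.
WORK=$(mktemp -d)
mkdir -p "$WORK/mysql/mydb" "$WORK/remote"
head -c 2048 /dev/urandom > "$WORK/mysql/mydb/table1.MYD"

# Pack and compress the database directory in one step.
tar -C "$WORK/mysql" -czf "$WORK/mydb.tar.gz" mydb

# Ship it; with a real remote host this would be:
#   scp "$WORK/mydb.tar.gz" otherhost:/var/lib/
cp "$WORK/mydb.tar.gz" "$WORK/remote/"

# Unpack on the "remote" side.
tar -C "$WORK/remote" -xzf "$WORK/remote/mydb.tar.gz"
```

This works, but it transfers the full 800 MB every time and gives no
revision history, which is why I was hoping CVS could do the job.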

Thanks,
Bryan
