RE: question
From: Teala Spitzbarth
Subject: RE: question
Date: Mon, 13 Aug 2001 13:57:08 -0700
(Ehem), I've checked tar files as large as 380Mb into
our CVS repository and haven't had any problems -
other than the long time it takes to pull or update
that module, especially over pserver.
I would recommend that if you need to archive tar files,
you put them in a separate module from the rest
of your source - so developers aren't stuck
pulling the beast into their working copies on every update....
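One way to sketch that separation (module names here are hypothetical) is with aliases in the CVSROOT/modules administrative file, so the archive directory is its own module that developers check out only when they need it:

```
# CVSROOT/modules -- hypothetical names, a sketch only
# Day-to-day source tree; checked out and updated routinely.
proj-src   proj/src
# Big tarball archive; checked out only on demand.
proj-dist  proj/dist
```

With entries like these, `cvs checkout proj-src` and `cvs update` inside it never touch the archive module.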
Cheers,
Teala
-----Original Message-----
From: Alexander Kamilewicz [mailto:address@hidden
Sent: Monday, August 13, 2001 1:49 PM
To: address@hidden
Subject: Re: question
> Mark Lancisi wrote:
>
> Does anyone know how cvs handles checkins of large binaries?
> I'm looking to checkin a 30Mb tarball. I've heard cvs might have
> problems with this..
>
> Thanks in advance for any help
>
>
The problem you're most likely to have is that when importing into
CVS, CVS makes a copy of the module in /tmp. If what you're
importing is bigger than the space available in /tmp, the import will fail.
So, in your case, make sure /tmp on the CVS server has more than 30 MB free.
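A quick sketch of checking that, assuming shell access to the server (the repository path and module names below are hypothetical; CVS's global -T option and the TMPDIR environment variable both select its temporary directory):

```shell
# Check how much room /tmp has before attempting a large import.
df -k /tmp

# If /tmp is too small, point CVS at a roomier scratch directory, e.g.:
#   cvs -T /var/scratch import -m "import tarball" project VENDOR REL_1
# CVS also honors $TMPDIR when -T is not given:
TMPDIR=${TMPDIR:-/tmp}
echo "CVS temp dir would be: $TMPDIR"
```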
But why check in a tarball? Wouldn't it make more sense to check in the
individual files in the tar?
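That approach can be sketched as follows - unpack the tarball and import the resulting tree, so CVS versions the individual files (the file names and import tags here are made up for illustration; a tiny stand-in tarball is built in place of the real 30 MB one):

```shell
set -e
work=$(mktemp -d)
cd "$work"

# Stand-in for the real 30 MB release tarball:
mkdir src
echo 'int main(void){return 0;}' > src/main.c
tar czf release.tar.gz src

# Unpack into a clean tree; this tree is what you would hand to
# cvs import, e.g.:
#   cvs import -m "initial vendor import" project VENDOR REL_1
rm -rf src
tar xzf release.tar.gz
ls src/main.c
```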
_______________________________________________
Info-cvs mailing list
address@hidden
http://mail.gnu.org/mailman/listinfo/info-cvs