[Dvdrtools-users] 2GB file limit workaround


From: Allan Peda
Subject: [Dvdrtools-users] 2GB file limit workaround
Date: 13 Jun 2003 16:17:57 -0400

While this is not strictly a mkisofs-related topic, it does relate to
dvdrecord. I see so many problems, and so few solutions, on this list
that I decided to post a solution of sorts.

I was trying to save multi-gigabyte database dumps to DVD, but
limitations within mkisofs (more specifically, in the Joliet file
system) prevented me from storing these fat files in an ISO image.

The workaround is to write the files into a tar archive and burn that
archive to the DVD directly, sans any file system. While this does not
produce a DVD that can be mounted in the normal fashion, it does
produce a DVD that can be treated like a tape and used for backups.
The big win, of course, is that there is no 32-bit signed integer
(2 GB) limit on the file size.

To illustrate:

0.) md5sum the files of interest:
md5sum /data/multiGigFile.dump | tee /data/multiGigFile.md5sum

1.) Create backup tar file: 
tar cf /data/multiGigFile.tar /data/multiGigFile.dump
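
As a quick sanity check before burning, tar can list the archive back
to you without extracting anything:

tar tvf /data/multiGigFile.tar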

2.) Shoot this onto the DVD using dvdrecord:
dvdrecord -v -pad -dao dev=1,0,0 speed=4 /data/multiGigFile.tar
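
If you are not sure what to put in dev=, dvdrecord should accept the
usual cdrecord-style bus scan (true of the versions I have used; check
yours):

dvdrecord -scanbus
# the burner shows up as a scsibus,target,lun triple, e.g. 1,0,0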

3.) Extract it again using dd.  The restored file will be bigger than
the original due to the padding added at burn time.
dd if=/dev/cdrom of=/data/multiGigFile_restored.tar
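
A bigger block size makes the read noticeably faster; 2048 bytes is
the DVD sector size (substitute your own device node for /dev/cdrom):

dd if=/dev/cdrom of=/data/multiGigFile_restored.tar bs=2048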

4.) Now compare the contents: change to another directory, extract
with tar xvf /data/multiGigFile_restored.tar, and run md5sum on the
extracted contents.  If the sums match, the files should be identical.
(Or try cmp; diff griped about memory being exhausted.)
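
Spelled out, the comparison looks something like this (directory names
are just examples; note that GNU tar strips the leading / from member
names, so the file extracts under a relative path):

mkdir /data/restore_test
cd /data/restore_test
tar xvf /data/multiGigFile_restored.tar
md5sum data/multiGigFile.dump
# compare the sum by eye against /data/multiGigFile.md5sum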

So far everything has been good.

I'm sure this could be streamlined with pipes, but I have the disk
space and am relatively short on RAM, so I'm leaving the intermediate
files around for now.  As we've seen before, it's best to compress
component files _before_ placing them in the archive.  I save the
md5sum file for the uncompressed data in the archive as well.
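
For what it's worth, the compress-first sequence looks roughly like
this, reusing the file names from the example above:

md5sum /data/multiGigFile.dump > /data/multiGigFile.md5sum
bzip2 -9 /data/multiGigFile.dump
# leaves /data/multiGigFile.dump.bz2 in place of the original
tar cf /data/multiGigFile.tar /data/multiGigFile.dump.bz2 \
    /data/multiGigFile.md5sum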

The bzip2 man page seems to imply that it has some sort of error
detection that I have not seen mentioned for gzip, so perhaps bzip2 is
the better choice for big files for that reason.
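
Whatever the details, bzip2 does let you test an archive's integrity
without decompressing it to disk, and the bundled bzip2recover tool
can pull the intact blocks out of a damaged file:

bzip2 -tvv /data/multiGigFile.dump.bz2
# exits nonzero (and says which block) if a CRC check fails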



-- 
Allan Peda

Programmer, Gene Array Resource Center
Rockefeller University
Box 203
1230 York Ave
New York, NY 10021-6399

(tel) 212-327-7064
(fax) 212-327-7065




