Re: [Dvdrtools-users] 2GB file limit workaround


From: Robert M. Stockmann
Subject: Re: [Dvdrtools-users] 2GB file limit workaround
Date: Sat, 14 Jun 2003 04:00:07 +0200 (CEST)

On Fri, 13 Jun 2003, André Dalle wrote:

I can make up to 4.7 GB UDF ISO images without problems here, and possibly
even bigger, up to the file-size limit of current ext2/ext3 filesystems on
Linux systems running kernel 2.4.18 and higher:

# -l : allow full 31-character ISO 9660 filenames
# -r : generate SUSP and RR records using the Rock Ridge protocol to
#      further describe the files on the ISO 9660 filesystem
# -L : allow ISO 9660 filenames to begin with a period
# -J : generate Joliet directory records in addition to the regular
#      ISO 9660 file names
/usr/bin/mkisofs -o /mnt/backup/disc1.raw \
        -l -r -L -J \
        -V "Archive disc1" \
        -P "STOCK-2003012301" \
        -p "stock" \
        -A "Archive disc1" \
        "/mnt/data1/disc1"

[jackson:root]:(~)# mkisofs --version
mkisofs 2.0 (i586-mandrake-linux-gnu)

This mkisofs is a component of cdrtools-2.0, which I patched; see:

ftp://crashrecovery.org/pub/linux/cdrtools/

Robert

> What I did myself was to split my large files into smaller chunks.
> 
> I use GNU 'split' to split it into 50MB chunks, then I use parity
> archives ('par' utility) to generate redundancy data for my split
> volumes.
> 
> This way I can recover the large file even if data errors on the disc
> prevent me from reading a few of the split volumes.
> 
> The par program also makes it easy to verify the integrity of all of the
> split volumes with one command.
> 
> If they're all OK, I can just use GNU 'cat' to concatenate all the files
> back together on disk.
> 
> I also include the md5sum of the large file so I can verify it is OK
> after concatenating the small files.
> 
> What I can't do is repair/recover bad volumes without copying all the
> split volumes to disk first.
> 
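
A minimal sketch of the split/parity workflow described above (the file names
are hypothetical; the parity step is shown with the newer par2 syntax rather
than the original 'par' tool, whose flags differ):

md5sum multiGigFile.dump > multiGigFile.md5sum   # record the original checksum
split -b 50m multiGigFile.dump vol.              # 50 MB chunks: vol.aa, vol.ab, ...
par2 create vol.par2 vol.??                      # generate redundancy volumes
# ... burn vol.??, vol.par2*, and multiGigFile.md5sum to the DVD ...
# after copying everything back from the disc:
par2 verify vol.par2                             # one-command integrity check
cat vol.?? > multiGigFile.dump                   # reassemble the large file
md5sum -c multiGigFile.md5sum                    # confirm it matches the original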
> On Fri, Jun 13, 2003 at 04:17:57PM -0400, Allan Peda wrote:
> > While this is not a mkisofs-related topic, it does relate to dvdrecord.
> > I see so many problems and so few solutions on this list that I decided
> > to post a solution of sorts.
> > 
> > I was trying to save multi-gigabyte database dumps to DVD, but
> > limitations within mkisofs (more specifically, the Joliet file system)
> > prevented me from storing these fat files in an ISO image.
> > 
> > The workaround is to write the files to a tar archive directly, and save
> > this to the DVD, sans any file system.  While this does not result in a
> > DVD that can be mounted in the normal fashion, it does result in a DVD
> > that can be treated as a tape, and used for backups.  The big win of
> > course is that there is no 32-bit integer limit on the file
> > size.
> > 
> > To illustrate:
> > 
> > 0.) md5sum the files of interest:
> > md5sum /data/multiGigFile.dump | tee /data/multiGigFile.md5sum
> > 
> > 1.) Create backup tar file: 
> > tar cf /data/multiGigFile.tar /data/multiGigFile.dump
> > 
> > 2.) Shoot this onto the DVD using dvdrecord:
> > dvdrecord -v -pad -dao dev=1,0,0 speed=4 /data/multiGigFile.tar
> > 
> > 3.) Extract it again using dd.  It will be bigger due to padding.
> > dd if=/dev/cdrom of=/data/multiGigFile_restored.tar
> > 
> > 4.) Now compare the contents by changing to another directory, extracting
> > with tar xvf /data/multiGigFile_restored.tar, and then running md5sum on
> > the contents of the restored archive.  If the sums match, the files
> > should be identical (or try cmp; diff griped about memory being
> > exhausted).
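
A hedged sketch of that comparison step (the scratch directory is arbitrary;
GNU tar strips the leading '/' on extraction, so the file reappears under
./data/):

mkdir /tmp/verify && cd /tmp/verify
tar xvf /data/multiGigFile_restored.tar
md5sum data/multiGigFile.dump        # compare against the sum recorded in step 0
cat /data/multiGigFile.md5sum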
> > 
> > So far everything has been good.
> > 
> > I'm sure this could be streamlined with pipes, but I have the disk
> > space, and am relatively short of RAM, so I'm leaving the files around
> > for now.  As we've seen before, it's best to compress component files
> > _before_ placing them in the archive.  I save the uncompressed md5sum
> > file in the archive as well.  
> > 
> > The bzip2 man pages seem to imply that it has some sort of error
> > detection, which I have not seen mentioned for gzip, so perhaps it's
> > better for big files for that reason.
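
A short sketch of compressing before archiving and using bzip2's built-in
integrity check (file names as in the example above; note that bzip2 replaces
the original file with a .bz2):

md5sum /data/multiGigFile.dump > /data/multiGigFile.md5sum
bzip2 -9 /data/multiGigFile.dump                 # leaves /data/multiGigFile.dump.bz2
bzip2 -t /data/multiGigFile.dump.bz2             # -t verifies the stored block CRCs
tar cf /data/multiGigFile.tar /data/multiGigFile.dump.bz2 /data/multiGigFile.md5sum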
> > 
> > 
> > 
> > -- 
> > Allan Peda
> > 
> > Programmer, Gene Array Resource Center
> > Rockefeller University
> > Box 203
> > 1230 York Ave
> > New York, NY 10021-6399
> > 
> > (tel) 212-327-7064
> > (fax) 212-327-7065
> > 
> > 
> > 
> > _______________________________________________
> > Dvdrtools-users mailing list
> > address@hidden
> > http://mail.nongnu.org/mailman/listinfo/dvdrtools-users
> > 
> 
> 

-- 
Robert M. Stockmann - RHCE
Network Engineer - UNIX/Linux Specialist
crashrecovery.org  address@hidden




