Re: [Dvdrtools-users] 2GB file limit workaround
From: André Dalle
Subject: Re: [Dvdrtools-users] 2GB file limit workaround
Date: Fri, 13 Jun 2003 18:19:56 -0400
User-agent: Mutt/1.4.1i
What I did myself was to split my large files into smaller chunks.
I use GNU 'split' to cut each file into 50MB chunks, then I use parity
archives (the 'par' utility) to generate redundancy data for the split
volumes.
This way I can recover the large file even if data errors on the disc
prevent me from reading a few of the split volumes.
The par program also makes it easy to verify the integrity of all of the
split volumes with one command.
If they're all OK, I can just use GNU 'cat' to reassemble the large
file on disk.
I also include the md5sum of the large file so I can verify it is OK
after concatenating the small files.
What I can't do is repair/recover bad volumes without first copying all
the split volumes to disk.
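
In outline, the whole round trip looks something like this. Treat it
as a sketch only: the filenames are placeholders, and I'm showing
par2-style create/verify/repair commands, since the exact flags of the
original 'par' differ between versions.

  # create the split volumes, parity data and checksum
  cd /data
  split -b 50m bigfile.dump bigfile.part.
  par2 create -r10 bigfile.par2 bigfile.part.*
  md5sum bigfile.dump > bigfile.md5

  # restore: copy the volumes plus the .par2 and .md5 files from the
  # disc into a scratch directory, cd there, then:
  par2 verify bigfile.par2            # checks every split volume at once
  par2 repair bigfile.par2            # only if verify reports damage
  cat bigfile.part.* > bigfile.dump   # reassemble the large file
  md5sum -c bigfile.md5               # confirm the reassembled copy
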
On Fri, Jun 13, 2003 at 04:17:57PM -0400, Allan Peda wrote:
> While this is not a mkisofs-related topic, it does relate to dvdrecord.
> I see so many problems, and so few solutions, on this list that I
> decided to post a solution of sorts.
>
> I was trying to save multi-gigabyte database dumps to DVD, but
> limitations within mkisofs (more specifically, the Joliet file system)
> prevented me from storing these fat files in an ISO image.
>
> The workaround is to save the files to a tar file directly, and write
> this to the DVD, sans any file system. While this does not result in a
> DVD that can be mounted in the normal fashion, it does result in a DVD
> that can be treated as a tape, and used for backups. The big win, of
> course, is that the 2GB signed 32-bit limit on file size no longer
> applies.
>
> To illustrate:
>
> 0.) md5sum the files of interest:
> md5sum /data/multiGigFile.dump | tee /data/multiGigFile.md5sum
>
> 1.) Create backup tar file:
> tar cf /data/multiGigFile.tar /data/multiGigFile.dump
>
> 2.) Shoot this onto the DVD using dvdrecord:
> dvdrecord -v -pad -dao dev=1,0,0 speed=4 /data/multiGigFile.tar
>
> 3.) Extract it again using dd (see also the sketch after step 4). It
> will be bigger due to padding.
> dd if=/dev/cdrom of=/data/multiGigFile_restored.tar
>
> 4.) Now compare the contents: change to another directory, extract
> with tar xvf /data/multiGigFile_restored.tar, and then md5sum the
> restored file (spelled out in the sketch below). If the sums match,
> the files should be identical. (Or try cmp; diff griped about memory
> being exhausted.)
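>
> Spelling out steps 3 and 4 a little further, as an untested sketch
> (/scratch below is just a placeholder directory):
>
> # step 3: reading with the DVD's 2048-byte sector size should be
> # noticeably faster than dd's 512-byte default
> dd if=/dev/cdrom of=/data/multiGigFile_restored.tar bs=2048
>
> # step 4: tar strips the leading '/', so the restored copy lands
> # under ./data/ in the scratch directory
> cd /scratch
> tar xvf /data/multiGigFile_restored.tar
> md5sum data/multiGigFile.dump
> cat /data/multiGigFile.md5sum   # compare the two sums by hand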
>
> So far everything has been good.
>
> I'm sure this could be streamlined with pipes, but I have the disk
> space, and am relatively short of RAM, so I'm leaving the files around
> for now. As we've seen before, it's best to compress component files
> _before_ placing them in the archive. I save the uncompressed md5sum
> file in the archive as well.
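>
> One way the pipe might look, though I have not tried it: cdrecord
> (and so presumably dvdrecord) accepts '-' to read the track from
> stdin, but -dao then needs the track size up front via tsize=, which
> you would have to compute from the tar size first (the value below is
> just a placeholder).
>
> tar cf - /data/multiGigFile.dump | \
>   dvdrecord -v -pad -dao dev=1,0,0 speed=4 tsize=4300m -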
>
> The bzip2 man pages seem to imply that it has some sort of error
> detection that I have not read about for gzip, so perhaps it's better
> for big files for that reason.
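>
> For what it's worth, bzip2's built-in integrity check is just its -t
> flag (the filename below is whatever you compressed):
>
> bzip2 -tvv /data/multiGigFile.dump.bz2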
>
>
>
> --
> Allan Peda
>
> Programmer, Gene Array Resource Center
> Rockefeller University
> Box 203
> 1230 York Ave
> New York, NY 10021-6399
>
> (tel) 212-327-7064
> (fax) 212-327-7065
>
>
>
> _______________________________________________
> Dvdrtools-users mailing list
> address@hidden
> http://mail.nongnu.org/mailman/listinfo/dvdrtools-users
>
--
Andre Dalle address@hidden
Space Monkey