Re: [Dvdrtools-users] 2GB file limit workaround


From: Volker Kuhlmann
Subject: Re: [Dvdrtools-users] 2GB file limit workaround
Date: Sat, 14 Jun 2003 13:47:57 +1200

Thanks for posting that! I was wondering what other people do.

> I was trying to save multi-gigabyte database dumps to DVD, but
> limitations within mkisofs (more specifically the Joliet file system)

Are you sure? Joliet is an optional extra; on unix it's just a waste of
space.
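
For reference, mkisofs only writes a Joliet tree when you ask for it with
-J; a Rock Ridge-only image is simply (image and directory names made up):

  mkisofs -r -o /data/master.img /data/tosave   # Rock Ridge only, no -J, no Joliet tree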

> The workaround is to save the files to a tar file directly, and save
> this to the DVD, sans any file system.

The DVD is just another block device; whatever filesystem can be put on
a hard disk can also go on a DVD. Ext2 works well, and the disk space
requirements are the same as for mkisofs. Obviously you can't store
files larger than a DVD that way, but you can with your method -
interesting idea.

If you only want to store files larger than 2GB but smaller than a DVD,
use an ext2 filesystem (or udf, with mkudffs), and the disk can be
mounted in the usual fashion (hint: loop mount). Whether it can be read
on Microshite is none of my concern; if you use DVDs as tar tapes it
sounds like it's none of yours either.
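
Spelled out, that looks roughly like this (device names, image path and
size are assumptions, adjust to your setup):

  dd if=/dev/zero of=/tmp/dvd.img bs=1M count=4400   # blank image, roughly one DVD
  mke2fs -F -m 0 /tmp/dvd.img                        # ext2, no reserved blocks
  mount -o loop /tmp/dvd.img /mnt/img                # loop mount the image
  cp /data/multiGigFile.dump /mnt/img/
  umount /mnt/img
  dvdrecord -v dev=0,0,0 /tmp/dvd.img                # burn; dev= depends on your burner
  mount -r -t ext2 /dev/dvd /mnt/dvd                 # later, read it like any block device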

I always(!!) create md5sums of all files first and store them with the
data, just before mastering, and the master img isn't deleted until the
burnt disk verifies ok. It's trivial to do, and allows bad media
detection later (though not correction). I also always store an md5sum
of the burnt img elsewhere; this allows a quick verify of the entire
disk. All scripted of course, called md5 in
http://volker.dnsalias.net/soft/scriptutils.tar.gz (rpms there too).
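
Without the script, the same thing by hand is roughly (all names below
are made up):

  cd /data/tosave && find . -type f ! -name MD5SUMS | xargs md5sum > MD5SUMS
  mkisofs -r -o /data/master.img /data/tosave
  md5sum /data/master.img > /data/master.img.md5    # keep this elsewhere
  dvdrecord -v dev=0,0,0 /data/master.img
  # quick verify of the whole disk: read back as many bytes as the image has
  dd if=/dev/dvd bs=2048 | head -c $(ls -l /data/master.img | awk '{print $5}') | md5sum
  cat /data/master.img.md5                          # the sums should match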

> md5sum /data/multiGigFile.dump | tee /data/multiGigFile.md5sum
> tar cf /data/multiGigFile.tar /data/multiGigFile.dump

Unless you have a specific reason to have a tar format on disk, you could
just skip making the tar file. If you're dealing with a file larger
than a DVD, the raw method is to use split (and copious quantities of
md5sum and disk space).
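
Roughly (file names and the chunk size below are made up; pick a size
that fits your media):

  md5sum /data/hugeFile.dump > /data/hugeFile.md5sum
  split -b 4300m /data/hugeFile.dump /data/hugeFile.part.   # hugeFile.part.aa, .ab, ...
  md5sum /data/hugeFile.part.* >> /data/hugeFile.md5sum
  # burn one part per DVD; to restore, read the parts back, then:
  cat /data/hugeFile.part.* > /data/hugeFile.dump
  md5sum -c /data/hugeFile.md5sum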

> 3.) Extract it again using dd.  It will be bigger due to padding.
> dd if=/dev/cdrom of=/data/multiGigFile_restored.tar

Am I the only one who finds that Linux is now too crappy to read
CDs/DVDs? Most of the time I get an error with that. Kernel 2.4.20.

cat /dev/cdrom is identical to your dd. I have found in the past that
some kernels panic at EOM (end of media recording). I have also found
that, until recently, it returns too much data (some of the zeroes added
by cdrecord -pad), and that on CDs written without -pad (e.g. all of the
commercial ones) it craps out with an I/O error even before it has read
all of the blocks belonging to the isofs. On a current 2.4.20, it
craps out well before the end of the isofs at least half of the time.
THIS SUCKS. It's independent of filesystem, happens with ext2 as well.
Same for CD + DVD. The only fix is to append 2Mbyte worth of zeroes to
whatever gets burnt. cdrecord -pad or -nopad is irrelevant (-pad only
adds 15k).
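
That fix is a one-liner, but the md5sum of the image then has to be taken
after the padding (image name assumed as above):

  dd if=/dev/zero bs=1M count=2 >> /data/master.img   # append 2MB of zeroes before burning
  md5sum /data/master.img > /data/master.img.md5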

> compare, then the files should be identical (or try cmp - diff griped
> about memory being exhausted)

md5sum is twice as fast as cmp: cmp needs to read both sets of data,
md5sum only one. Compared with the time it takes to read one set of
data, the time spent computing the md5 is insignificant.
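
In other words (the restore path here is made up):

  # cmp has to read both copies in full:
  cmp /data/multiGigFile.dump /data/restored/multiGigFile.dump
  # md5sum only needs to read one; compare its output against the stored sum:
  md5sum /data/restored/multiGigFile.dump
  cat /data/multiGigFile.md5sum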

diff is no good as it is line-based: it needs to read a full line into
memory before it can compare. Depending on the data, this can kill you
before the line is finished.

Volker

-- 
Volker Kuhlmann                 is possibly list0570 with the domain in header
http://volker.dnsalias.net/             Please do not CC list postings to me.



