Re: [Lzip-bug] chunking the files?


From: John Reiser
Subject: Re: [Lzip-bug] chunking the files?
Date: Sat, 08 Dec 2012 08:25:53 -0800
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:17.0) Gecko/17.0 Thunderbird/17.0

> I was wondering if you've considered an option for making file chunks, i.e.,
> if I have a 500 GB file that needs to go over the internet I can choose 1 GB
> chunks (1 GB after compression).  While I do use an FTP client that can
> resume after an interruption, it is still helpful to have breakpoints to
> restart from in case of failures it cannot recover from.

Apply "dd bs=1M count=2048 skip=..." or perhaps "split --bytes=2G"
to create the chunks before applying lzip.  Do this in batches of
10 chunks at a time to reduce the usage of temporary space.
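
Something along these lines should work (a rough sketch, not tested
here; "bigfile" and "chunk" are placeholder names, and lzip's default
behavior of deleting each input file after compressing it is assumed):

  #!/bin/sh
  # Cut "bigfile" into 2 GiB pieces and compress each piece as soon
  # as it is produced, so at most one uncompressed piece exists at
  # any time.
  i=0
  while :; do
      dd if=bigfile of=chunk$i bs=1M count=2048 skip=$((i * 2048)) \
          2>/dev/null
      [ -s chunk$i ] || { rm -f chunk$i; break; }  # empty: past EOF
      lzip chunk$i             # writes chunk$i.lz, removes chunk$i
      i=$((i + 1))
  done

If your GNU split is new enough to have --filter (coreutils 8.13 or
later, if memory serves), the temporary pieces can be skipped
entirely with something like
"split --bytes=2G --filter='lzip > $FILE.lz' bigfile chunk".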

Years of experience with zlib show that compressing blocks even as
small as 1 MiB independently costs about 1% or less in compressed
size, since the only losses are the matches that would have crossed
a block boundary.  Only colossal jackpots ("deliberate" repetitions
with match length >= 10,000) make this untrue, and if you have many
of those then you should use some technique based on the original
pieces instead of "blind" compression of the concatenation.
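
A quick way to check that figure on your own data (again a sketch;
"sample" is a placeholder for a test file small enough that split's
default two-letter suffixes suffice, and GNU split with --filter is
assumed):

  # Size when the whole file is compressed as one stream.
  lzip -c sample | wc -c
  # Total size when compressed as independent 1 MiB pieces.
  split --bytes=1M --filter='lzip > $FILE.lz' sample piece.
  cat piece.*.lz | wc -c
  rm -f piece.*.lz

The difference between the two counts is the cost of the chunk
boundaries.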
