
Re: [Duplicity-talk] Seeding a remote backup


From: Lluís Batlle i Rossell
Subject: Re: [Duplicity-talk] Seeding a remote backup
Date: Thu, 14 Jun 2012 17:09:47 +0200
User-agent: Mutt/1.5.20 (2009-06-14)

On Thu, Jun 14, 2012 at 05:02:57PM +0200, address@hidden wrote:
> On 14.06.2012 16:55, Andrew Kohlsmith (mailing lists account) wrote:
> > On 2012-06-13, at 1:48 PM, address@hidden wrote:
> >> The currently suggested workaround to "do fulls regularly" is to simply 
> >> move the incrementals based on a full (e.g. every two weeks) into a 
> >> subfolder and start new incrementals. This of course gets more expensive 
> >> in terms of bandwidth as the changes relative to the full accumulate 
> >> over time.
> > 
> > I've been running duplicity for a couple of years now. I create 650MB 
> > backup files and do incrementals for 13 weeks. I'm not sure how big the 
> > largest backup set is, but it does take forever to upload on a normal cable 
> > connection. :-/
> > 
> >> To restore a specific older time you will of course have to manually 
> >> move those incrementals back and move the recent incrementals into 
> >> another subfolder, and vice versa.
> > 
> > I thought in order to restore to a specific point-in-time you needed the 
> > full plus all incrementals from that full to the point you want to restore. 
> > I don't think it's possible to use a more recent full backup with older 
> > incrementals.
> > 
> 
> That's not what I suggested:
> 
> What I meant was: do new incrementals against the old full, to artificially 
> shorten the chain and thus minimize the chance of a defective volume 
> killing all backups after it.
> 
> Anyway, this is a workaround and of course no real solution. It also does 
> not protect you from the full itself getting corrupted, so you additionally 
> need a copy of the full in a safe place.
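
A rough sketch of the subfolder workaround described above (the paths and the
file:// target are assumptions, not from the thread):

    # Hypothetical local backup directory.
    cd /backups/data

    # Park the current incremental chain in a subfolder; duplicity then
    # sees only the full and diffs the next run against it.
    mkdir -p chain-archive
    mv duplicity-inc.* duplicity-new-signatures.* chain-archive/

    # New incrementals are now made directly against the old full.
    duplicity incr /data file:///backups/data

    # To restore an older point in time, swap the chains back first,
    # then restore with an explicit time:
    duplicity restore --time 2012-06-01 file:///backups/data /tmp/restored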

Depending on the backend, other solutions can work more effectively. Not all
backends are so dumb: for local files you can compare checksums, and the same
goes for a remote computer with shell access.
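
For example, a minimal checksum comparison over ssh (host name and paths are
made up):

    # Checksum the volumes on both sides and diff the lists.
    (cd /backups/data && sha256sum duplicity-*) | sort > /tmp/local.sums
    ssh backuphost 'cd /backups/data && sha256sum duplicity-*' | sort > /tmp/remote.sums
    diff /tmp/local.sums /tmp/remote.sums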

Other programs can do multilevel incremental backups. That is, full backup A,
then some incrementals B against A, and then incrementals C against B, creating
some kind of tree of incrementals.
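
The classic dump(8) works this way with numeric levels; a minimal sketch
(the filesystem and file names are assumptions):

    # Level 0 is a full; each higher level diffs against the most
    # recent dump of a lower level, so the levels form a tree.
    dump -0u -f /backups/home.0.dump /home    # A: full
    dump -1u -f /backups/home.1.dump /home    # B: against A
    dump -2u -f /backups/home.2.dump /home    # C: against B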

Here we back up around 40GB, want a backup for every day of the last month,
and produce no more than 1MB of changes per day. We have used duplicity
before, but regularly uploading full backups is overkill. And our backend is
not so dumb: we back up locally, and then rsync.
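
In case it helps, that setup is roughly (a sketch with made-up paths, not our
exact configuration):

    # Back up to a local directory first; duplicity does a full if none
    # exists, otherwise an incremental.
    duplicity /data file:///backups/data

    # Then mirror the backup directory to the remote machine; rsync only
    # transfers volumes that changed.
    rsync -a --delete /backups/data/ backuphost:/backups/data/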

Regards,
Lluís.


