
Re: [Duplicity-talk] Remote S3 backups not the same size as source?


From: edgar . soldin
Subject: Re: [Duplicity-talk] Remote S3 backups not the same size as source?
Date: Mon, 23 Dec 2013 17:25:54 +0100
User-agent: Mozilla/5.0 (Windows NT 5.1; rv:24.0) Gecko/20100101 Thunderbird/24.2.0

run duply/duplicity with '-v9' and you'll see what it does. this should point 
out what is taking so long.
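a sketch of the invocation, assuming a duply profile named 'email' as in the original post; duply hands trailing options through to duplicity, so the exact flag placement may differ with your setup, and the bucket URL below is purely illustrative:

```shell
# maximum verbosity: prints every file duplicity considers,
# which shows where a backup or verify run is spending its time
duply email verify -v9

# roughly equivalent direct duplicity call
# (bucket name "my-bucket" is illustrative, not from the thread)
duplicity verify -v9 s3+http://my-bucket/email /mnt/Data/Email
```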

be aware that verify downloads "all" volumes from the backend, so have a look 
at your s3 bandwidth quota.
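since verify fetches every volume, the download is roughly the volume count times the 25 MB volume size mentioned later in the thread. a quick back-of-envelope check (the volume count of 1565 is illustrative, chosen to match the 38.2 GB Brandon reports):

```shell
# estimate verify's total S3 download: every difftar volume comes back,
# at up to 25 MiB each
volumes=1565   # illustrative count of difftar volumes in the bucket
awk -v n="$volumes" 'BEGIN { printf "%.1f GiB\n", n * 25 / 1024 }'  # → 38.2 GiB
```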

..ede/duply.net

On 22.12.2013 19:11, Brandon wrote:
> I have had “duply email verify” running for 45 minutes now with no response. 
> There are, like I said, 83 GB locally and 38 GB remote, but I don’t understand 
> why it would not have responded by now. How long does “verify” normally take 
> to complete? 
> 
> Also, I have not excluded anything; I have the directory, /mnt/Data/Email , 
> set with absolutely no exclusions. 
> 
> Thanks,
> 
> Brandon
> 
> On Dec 22, 2013, at 7:26 AM, address@hidden wrote:
> 
>> On 22.12.2013 09:21, Brandon wrote:
>>> I am using “duply”, a simplified front-end program to duplicity. I am 
>>> backing up all contents of a directory, amongst others, /mnt/Data/Email, to 
>>> an Amazon S3 bucket. There are a total of 83 GB in files. However, when I 
>>> first ran the backup operation, it backed up approximately 37 GB and then 
>>> said it was complete. I figured something went wrong so I then ran the 
>>> backup operation (incremental) again and it backed up another 1.2 GB even 
>>> though I had not modified, removed or added any files to the directory to 
>>> be backed up. I thought something might have been interrupting it so I ran 
>>> it yet again for the third time but that and all subsequent backup runs 
>>> backed up nothing else (except the 104 byte accounting file or whatever). 
>>> It is not backing up any more than 38.2 GB of 83 GB of files. I have 
>>> verified this: the number of files in the S3 directory, multiplied by 25 MB 
>>> (the size of each difftar spanning archive file), indeed comes to just 
>>> 38.2 gigabytes.
>>>
>>> What gives here? Why is it not backing up all 83 GB? Is there some kind of 
>>> compression going on (although it would still be more than 38.2 GB because 
>>> I have most of my content already compressed)? For whatever reason 
>>> duplicity does not seem to be backing everything up and this worries me.
>>>
>>> Also, another directory, /mnt/Data/Organized, is 5.1 GB (according to du 
>>> -hs) yet duply/duplicity says source is 4.94 GB. This is extremely close, 
>>> and may just be a matter of accounting and such, and may be correct. 
>>> However, /mnt/Data/Email is off by over 42 gigabytes!
>>>
>>> What can I do to remedy this?
>>>
>>
>> run your backup with maximum verbosity '-v9'. can be lots of info though.
>>
>> run 'verify' to see differences to the local data.
>>
>> make sure you did not exclude anything erroneously.
>>
>> ..ede/duply.net
>>
>> _______________________________________________
>> Duplicity-talk mailing list
>> address@hidden
>> https://lists.nongnu.org/mailman/listinfo/duplicity-talk
> 
> 
> _______________________________________________
> Duplicity-talk mailing list
> address@hidden
> https://lists.nongnu.org/mailman/listinfo/duplicity-talk
> 


