Re: [Duplicity-talk] how to tune duplicity for big files
From: Scott Hannahs
Subject: Re: [Duplicity-talk] how to tune duplicity for big files
Date: Fri, 27 Mar 2020 15:28:48 -0400
I would play with the block size before trying to make diffdir more efficient.
The issue could be GPG working hard, in which case you might try a lower-bit
key. It may also be that hash calculation on large files is causing the system
to run out of memory and start thrashing. In general, duplicity runs
efficiently and has no problem backing up even machines whose CPUs are very
slow by modern standards.
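For example, a sketch of a run that raises the volume and block sizes (the paths, target URL, and key ID are placeholders; the right values depend on your data and hardware):

```shell
# Larger volumes and a larger diff block size reduce per-chunk overhead on
# multi-GB files; a shorter GPG key is cheaper to encrypt with than a longer one.
# All paths, the target URL, and the key ID below are placeholders.
duplicity \
    --volsize 500 \
    --max-blocksize 20480 \
    --encrypt-key YOURKEYID \
    /data/bigfiles sftp://backup@backuphost//srv/backups/bigfiles
```

Time a run before and after changing one option at a time, so you can tell which knob actually helps.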
-sth
> On Mar 27, 2020, at 2:27 PM, Jelle de Jong via Duplicity-talk
> <address@hidden> wrote:
>
> Hello everybody,
>
> My duplicity is slow and runs at 100% CPU load for ages. I am doing a backup
> of large encrypted files, from 1GB to 100GB.
>
> What can I do to speed this up?
>
> change code?
> /usr/lib/python2.7/dist-packages/duplicity/diffdir.py
>
> use options?
> --max-blocksize=number
>
> Kind regards,
>
> Jelle de Jong
>
> _______________________________________________
> Duplicity-talk mailing list
> address@hidden
> https://lists.nongnu.org/mailman/listinfo/duplicity-talk