Re: [Duplicity-talk] Simple multi-generation backup
From: edgar.soldin
Subject: Re: [Duplicity-talk] Simple multi-generation backup
Date: Tue, 13 Dec 2022 15:46:56 +0100
User-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:102.0) Gecko/20100101 Thunderbird/102.5.0
On 13.12.2022 15:21, Felix Natter wrote:
hello Edgar,
thanks for your reply, answers are inline:
np Felix,
see below
On 08.12.22 13:37, edgar.soldin--- via Duplicity-talk wrote:
hey Felix,
On 08.12.2022 08:10, Felix Natter via Duplicity-talk wrote:
Dear duplicity community,
I would like to set up a _simple_ multi-generation backup using duplicity
to a cloud service:
- target $CLOUD/daily/: backup once per day, keep last X
- target $CLOUD/weekly/: backup once per week, keep last Y
- target $CLOUD/monthly/: backup once per month, keep last Z
My daily script (to be called by cron job) looks like this:
duplicity --encrypt-key KEY --full-if-older-than 5D /repos $CLOUD/daily/
duplicity --encrypt-key KEY remove-all-but-n-full 2 --force $CLOUD/daily/
And my weekly cron job would call this:
duplicity --encrypt-key KEY --full-if-older-than 3W /repos $CLOUD/weekly/
duplicity --encrypt-key KEY remove-all-but-n-full 2 --force $CLOUD/weekly/
and so on for the monthly backup.
Q1: Is that a valid/good way to do it?
kind of wasteful, but sure. why are your fulls not located at the time frame's
start?
Sorry, I do not understand. I do agree that recomputing weeklies/monthlies
is "wasteful" compared to moving some backups from daily to weekly or
monthly ("rotation"),
yeah, the doubled backup creation and file space requirement. you could just do
weekly fulls with daily incrementals into a week specific target folder on the
backend
say you do per cron (eg. use date command to determine week of year `date
+%Y.%U`)
do daily.. duplicity /repos $CLOUD/$(date +\%Y.\%U)/
NOTE: cron treats an unescaped % specially (it ends the command line and
feeds the rest to the command as stdin), so escape each % with a backslash ;)
that way you will end up with weekly fulls with daily incrementals.
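to make the scheme concrete, the target-folder naming could be wrapped in a
tiny helper like the sketch below (CLOUD, KEY and the helper name are
illustrative, not from this thread):

```shell
#!/bin/sh
# Build the week-specific target folder: %Y.%U expands to e.g. 2022.50
# (year.week-of-year, weeks starting Sunday), so a new folder, and with
# it a new full backup, begins with every new week.
week_target() {
    echo "$1/$(date +%Y.%U)/"
}

# daily cron job (mind cron's % escaping, see the note above):
#   duplicity --encrypt-key KEY /repos "$(week_target "$CLOUD")"
```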
don't forget to verify your backups regularly (to be safe).
to keep only the older weeklies/monthlies that you want, you may delete weekly
incrementals, or just delete whole weekly folders as needed later.
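deleting whole week folders combines nicely with the YYYY.WW naming, because
the names sort chronologically. a sketch for a locally mounted backend (the
helper name and mount point are assumptions):

```shell
#!/bin/sh
# Delete all but the newest $2 week folders under $1. Folder names like
# 2022.50 sort lexicographically in date order, so plain sort is enough.
prune_weeks() {
    root=$1; keep=$2
    ls "$root" | sort -r | tail -n +"$((keep + 1))" | while read -r week; do
        rm -rf "${root:?}/$week"
    done
}

# example: keep the 8 newest weekly folders
#   prune_weeks /mnt/backup/weekly 8
```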
but what do you mean by "fulls not located at the time frame's start"?
weeks start at Monday. months start with the first of month.
Is there a better way to do multi-generation backups with duplicity (or duply)?
probably. anyway, see my suggestion above. you need to make sure the date is
correct on the box all the time though ;)
Q2: I tried to make a daily backup. Then I added a file /repos/bla.txt, then
made a daily backup. I removed it and made another daily backup.
Then I tried to restore /bla.txt from the backup:
list the backup contents with the action command 'list-current-files'. as you
deleted it, it will probably not be there. use '--time' to point to a time
when it was not yet deleted; then it should be listed. from the man page
https://duplicity.us/stable/duplicity.1.html
"
list-current-files [--time <time>] <url>
Lists the files contained in the most current backup or backup at time. The
information will be extracted from the signature files, not the archive data
itself. Thus the whole archive does not have to be downloaded, but on the other
hand if the archive has been deleted or corrupted, this command will not detect
it.
"
duplicity restore --file-to-restore /bla.txt $CLOUD/daily /tmp/bla.txt
# (I tried -t 1D/2D/3D as well)
restore needs a _relative_ path to the backup root. from the man page
https://duplicity.us/stable/duplicity.1.html
"
restore [--file-to-restore <relpath>] [--time <time>] <url> <target_folder>
You can restore the full monty or selected folders/files from a specific time.
Use the relative path as it is printed by list-current-files. Usually not
needed as duplicity enters restore mode when it detects that the URL comes
before the local folder.
"
-> this was a typo, of course I used a relative path. Thanks for the hint with
list-current-files
and collection-status!
So the workflow for restoring something in the latest possible version is
roughly:
- use duplicity collection-status --file-changed=FILE and then choose the
appropriate backup time.
- if it is not found at all in $CLOUD/daily, repeat for $CLOUD/weekly and
$CLOUD/monthly
- do restore with the same time and path
sounds reasonable
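that lookup order could also be scripted. a sketch, where restore_file, CLOUD
and the grep check on the collection-status output are assumptions on top of
the commands discussed in this thread:

```shell
#!/bin/sh
# Walk daily -> weekly -> monthly and restore RELPATH (relative to the
# backup root, as printed by list-current-files) from the first target
# whose collection-status mentions the file.
restore_file() {
    relpath=$1; dest=$2
    for gen in daily weekly monthly; do
        # --file-changed summarizes the file's changes in the most recent
        # chain; if the file never appears there, try the next generation.
        if duplicity collection-status --file-changed "$relpath" \
                "$CLOUD/$gen/" | grep -q "$relpath"; then
            duplicity restore --file-to-restore "$relpath" \
                "$CLOUD/$gen/" "$dest"
            return 0
        fi
    done
    echo "$relpath not found in any generation" >&2
    return 1
}
```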
list-current-files only looks at the last full/incremental backup of the set
in $CLOUD/daily. But I need to be able to restore a file that was deleted a
few days ago.
list your backup chains to find out which backups exist
"
collection-status [--file-changed <relpath>] [--show-changes-in-set <index>]
<url>
Summarize the status of the backup repository by printing the chains and sets
found, and the number of volumes in each.
The --file-changed option summarizes the changes to the file (in the most
recent backup chain). The --show-changes-in-set option summarizes all the file
changes in the index:th backup set (where index 0 means the latest set, 1 means
the next to latest, etc.).
"
from the man page https://duplicity.us/stable/duplicity.1.html
Thanks, very helpful!
I think I am misunderstanding things, could you please point me to good
documentation?
https://duplicity.us/stable/duplicity.1.html
This looks very good, I will make sure to read it completely.
Thanks and Best Regards,
same to you.. sunny regards ede/duply.net