Re: [Duplicity-talk] Duplicity feature suggestions
From: Jay Summet
Subject: Re: [Duplicity-talk] Duplicity feature suggestions
Date: Sun, 12 Nov 2006 10:05:19 -0500
User-agent: Thunderbird 1.5.0.7 (X11/20060915)
Here is what I ended up doing after deciding that duplicity wouldn't quite work
for me.
1. I created a local mount for a remote (untrusted) filesystem using sshfs.
2. I created an encrypted filesystem using encfs on top of the remote FS mount.
[both sshfs and encfs are tools built using FUSE]
3. I'm using rsync to create my (encrypted, remote) backup.
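The three steps above can be sketched as shell commands. The host names and
paths below are placeholders I've invented for illustration, not taken from
my actual setup:

```shell
# 1. Mount the remote (untrusted) filesystem locally via sshfs.
sshfs user@remote.example.com:/backup /mnt/remote

# 2. Layer an encrypted view on top with encfs: the first argument is the
#    ciphertext directory (on the remote mount), the second is where the
#    plaintext view appears locally.
encfs /mnt/remote/enc /mnt/backup

# 3. Push the data with rsync; --partial keeps partially transferred files
#    so an interrupted run can resume where it left off.
rsync -av --partial /home/jay/ /mnt/backup/
```

The remote host only ever sees the ciphertext written under /mnt/remote/enc;
encryption and decryption happen locally inside the encfs layer.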
I'm backing up 17GB over a 20kbps uplink, so after a week or so and several
restarts it is still slowly building up the remote copy.
I may then attempt a cp -a to create a versioned, hard-linked backup, as MEEP
(a great, though non-encrypting, tool) does for local backups.
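The hard-linked snapshot idea can be demonstrated locally with a small
sketch (paths and the snapshot name are illustrative; note that it is the
-l flag to cp that produces hard links rather than full copies):

```shell
# Set up a stand-in for the live backup tree.
mkdir -p backup/current
echo "hello" > backup/current/file.txt

# Freeze the current state as a snapshot: cp -al recreates the directory
# tree but hard-links the files, so unchanged files cost no extra space.
cp -al backup/current backup/snapshot-2006-11-12

# A later rsync into backup/current replaces changed files with new inodes,
# leaving the snapshot's hard-linked copies untouched.
```

Each snapshot directory then looks like a complete backup while sharing
storage for every file that has not changed.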
One advantage of not using tar is that I can pick and choose individual
files, and browse my backup as a normal remote filesystem.
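For example, restoring a single file is just an ordinary copy through the
encfs mount (the paths here are illustrative, assuming the mount points
from the setup above):

```shell
# Pull one file straight out of the backup; no archive extraction needed.
cp /mnt/backup/etc/fstab /tmp/fstab.restored
```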
Jay
Jan Rychter wrote:
> Kenneth Loafman:
>> Jan Rychter wrote:
>>> [reposting directly to the list, as it seems postings sent via Gmane
>>> disappear into a black hole somewhere...]
>>>
>>> I've been using duplicity for a while now, having first tried it about a
>>> year ago and having recently come back to it. Here is a brief wishlist
>>> of features that would make it work much better for me. Hopefully
>>> someone will find this useful.
>>>
>>> -- Store local copies of manifests and signature files. These files do
>>> not change and their size still makes it acceptable to cache them
>>> locally. It would speed the backup process immensely if I could tell
>>> duplicity to just keep them stored locally as well, and only download
>>> from the server when the local copies are missing.
>>>
>>> -- Provide some way to recover from an aborted first backup, or provide
>>> a way to do this first backup in stages. I have a 50GB filesystem I
>>> need to back up and 50GB of space on another continent where I'd like
>>> this data to land. I can't use duplicity for that. There is simply no
>>> way to do the first backup without it being interrupted by something
>>> -- a network glitch, usually. As it stands now, I simply cannot back
>>> up that data.
>>>
>>> Other than that duplicity works very well for me (for smaller backups)!
>>>
>>> --J.
>> I stage large backups to a local system, then have that system rsync
>> the local to the remote. That will give you a local copy for fast
>> restore and a remote backup for site disasters.
>>
>> Disk space is cheap enough that you could keep a 50GB weekly backup and
>> daily incrementals on a single drive.
>
> Sorry, that's not an acceptable workaround. What if I have 200GB to
> back up and no way to plug in another drive locally?
>
> As a side note, I was actually amazed at the scarcity of remote backup
> solutions for Linux. There is a huge rsync crowd that, when asked about
> encryption, shouts "you don't need encryption! you should have trusted
> remote backup space!". There is another crowd that says "you don't need
> remote backup at all, just plug in more local drives". Duplicity stands
> out as one of the best solutions out there -- but it isn't quite there
> yet.
>
> --J.
>
>
> _______________________________________________
> Duplicity-talk mailing list
> address@hidden
> http://lists.nongnu.org/mailman/listinfo/duplicity-talk
>