duplicity-talk

[Duplicity-talk] Small but important patch for Amazon S3 support


From: Mitchell Garnaat
Subject: [Duplicity-talk] Small but important patch for Amazon S3 support
Date: Sun, 23 Sep 2007 07:51:11 -0400

Hi -

I've begun using Duplicity for some of my backups.  It's great!  I'm using the Amazon S3 backend exclusively, but I found a small problem in the list method.  The current version calls the "get_all_keys" method of the S3 bucket object to iterate over all of the keys (i.e., objects/files) stored in the S3 bucket.  The problem is that this method returns only the first 1000 keys in the bucket.  To get the subsequent keys, you need to keep calling S3 until it tells you it is out of results.  Fortunately, boto already provides a simple way to do this: just treat the bucket itself as a sequence object and iterate over the keys.  Boto will handle all of the paging of results from S3.

This is a one-line change in backends.py:

697c697
<               filename_list = [k.key for k in self.bucket.get_all_keys()]
---
>               filename_list = [k.key for k in self.bucket]
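To see why the one-liner matters, here is a self-contained sketch of the paging behaviour described above. It does not touch S3 or boto; FakeBucket and its methods are hypothetical stand-ins (keys are plain strings rather than boto Key objects), modelling a single-request listing capped at 1000 results versus an iterator that keeps requesting pages until the listing is exhausted.

```python
PAGE_SIZE = 1000  # S3 returns at most 1000 keys per listing request

class FakeBucket:
    """Hypothetical stand-in for an S3 bucket; keys are plain strings."""

    def __init__(self, keys):
        self._keys = sorted(keys)

    def get_all_keys(self, marker=""):
        """Return at most PAGE_SIZE keys after `marker` (one S3 request)."""
        remaining = [k for k in self._keys if k > marker]
        return remaining[:PAGE_SIZE]

    def __iter__(self):
        """Iterate over every key, issuing paged requests as needed."""
        marker = ""
        while True:
            page = self.get_all_keys(marker=marker)
            if not page:
                return
            for k in page:
                yield k
            marker = page[-1]  # resume the next request after the last key

bucket = FakeBucket(["key%05d" % i for i in range(2500)])

print(len(bucket.get_all_keys()))  # one request: 1000 keys, rest silently dropped
print(len(list(bucket)))           # iteration pages through all 2500 keys
```

With more than 1000 archive files in a bucket, the single-request form would make Duplicity's listing incomplete, which is why iterating over the bucket is the safe choice.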

Mitch
