duplicity-talk

Re: [Duplicity-talk] Mirroring a Duplicity Repository from S3 to local


From: Peter Schuller
Subject: Re: [Duplicity-talk] Mirroring a Duplicity Repository from S3 to local storage
Date: Sun, 18 Apr 2010 09:59:59 +0200

> It seems that duplicity stores changes in new files with unique names,
> so I should be able to just pull down the files from S3 that don't yet
> exist locally (and delete files that don't exist on S3). Is that
> correct? If so, is there a good way to do that, or should I write a
> script myself? Anything I should be careful of with this setup?

s3sync is one way to do rsync-style transfers from S3:

   http://s3sync.net/wiki

In terms of being careful, I suppose it depends on what you're trying
to accomplish. If you want to do this in case S3 fails, then deleting
local files that don't exist in S3 is dangerous unless you have a
reversible history on your local file system.
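One way to get that kind of reversible history locally is hardlink-based snapshots (the same idea as rsync's --link-dest): each snapshot directory hardlinks unchanged files from the previous snapshot, so old snapshots keep deleted files recoverable while unchanged files cost no extra disk space. A minimal sketch, with illustrative names and a simple mtime/size change check (not anything duplicity itself provides):

```python
import os
import shutil

def snapshot(src, prev_snap, new_snap):
    """Snapshot src into new_snap, hardlinking files unchanged since prev_snap.

    prev_snap may be None for the first snapshot. Files are considered
    unchanged if mtime and size match (a heuristic, like rsync's default).
    """
    os.makedirs(new_snap)
    for name in os.listdir(src):
        src_path = os.path.join(src, name)
        dst_path = os.path.join(new_snap, name)
        prev_path = os.path.join(prev_snap, name) if prev_snap else None
        if (prev_path and os.path.exists(prev_path)
                and os.path.getmtime(prev_path) == os.path.getmtime(src_path)
                and os.path.getsize(prev_path) == os.path.getsize(src_path)):
            # Unchanged since the last snapshot: hardlink, no extra space.
            os.link(prev_path, dst_path)
        else:
            # New or changed: make a real copy (copy2 preserves mtime).
            shutil.copy2(src_path, dst_path)
```

A file deleted from the mirror later is still present in the older snapshot directories, which is what makes the local copy safe to prune.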

The main thing to be aware of is probably S3's eventual consistency
model. If you delete local files that are gone on S3, without first
ensuring that the files replacing them are actually visible there, you
could end up with a broken archive under certain timing conditions in
the S3 consistency model. On the other hand, simply syncing again
should restore the missing files (assuming S3 has since reached
consistency). (Again, if the aim is to protect against S3 losing data
or being buggy, you'd need local history anyway as previously stated,
so for the purposes of this paragraph I'm assuming that's not the
case.)
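The safe ordering above ("download new files first, delete last") can be sketched like this. The listing and fetch function here are stand-ins for real S3 calls, purely to illustrate the ordering, not duplicity's or s3sync's actual implementation:

```python
import os

def mirror(remote_listing, fetch, local_dir):
    """Mirror remote files into local_dir, downloading before deleting.

    remote_listing: dict of filename -> anything (stand-in for an S3
        bucket listing); fetch: callable(name) -> bytes (stand-in for
        an S3 GET).
    """
    local_files = set(os.listdir(local_dir))
    remote_files = set(remote_listing)

    # Step 1: pull down the files that don't yet exist locally.
    for name in remote_files - local_files:
        with open(os.path.join(local_dir, name), "wb") as f:
            f.write(fetch(name))

    # Step 2: only after the replacement files are safely on disk,
    # delete local files absent from the remote listing. Deleting
    # first could leave a broken archive if S3's eventual consistency
    # hasn't yet surfaced the new files.
    for name in local_files - remote_files:
        os.remove(os.path.join(local_dir, name))
```

Even with this ordering, an inconsistent listing can still hide a new file for a while; the point is that a later sync pass then repairs the mirror rather than having already destroyed data.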

-- 
/ Peter Schuller



