It comes down to how much pain you are willing to deal with if you should ever have to do a full restore, since a restore will require reading the full backup plus every incremental done since. The d
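The point above, that a restore must read the full backup plus every incremental made since, can be sketched in a few lines. This is an illustrative model only (the function and names are hypothetical, not duplicity's internals): each incremental records only changes since the previous set, so the whole chain back to the last full is required.

```python
from datetime import date

# Hypothetical sketch: which backup sets a restore must read, assuming
# one full backup plus daily incrementals after it. Illustrative names,
# not duplicity's actual code.

def sets_needed(full_date, incr_dates, restore_date):
    """Return the backup sets a restore must read, oldest first."""
    if restore_date < full_date:
        raise ValueError("restore point predates the full backup")
    needed = [("full", full_date)]
    # Every incremental between the full and the restore point is
    # required, because each one only holds deltas against the set
    # before it.
    needed += [("incr", d) for d in sorted(incr_dates)
               if full_date < d <= restore_date]
    return needed

chain = sets_needed(date(2014, 1, 1),
                    [date(2014, 1, d) for d in range(2, 31)],
                    date(2014, 1, 15))
# Restoring to Jan 15 reads the full set plus 14 incrementals.
```

The practical trade-off follows directly: more frequent full backups cost upload time and storage, but shorten the chain a restore depends on.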
Hi, I'm a long-time rsync user who has just started using duplicity for encrypted remote backups to Google Cloud. There are a few questions for which I couldn't get an answer in the docs, or maybe
a quick search with google turned up http://stackoverflow.com/questions/6832248/paramiko-no-existing-session-exception which sounds similar.. ede/duply.net
saw your mail only just now, sorry. thanks for the detailed explanation! i wonder if the correct solution wouldn't be to add the mime-types to a list and compare against that, or to abandon testing type allt
Hi, I recently discovered Duplicity and I must say it's pretty awesome. Right now the best choice for backing up servers to "the Cloud". I am trying out the Google Cloud Storage target (using a DRA b
There was a problem with file permissions. Duplicity was installed under /usr/local/lib/python2.7/dist-packages/duplicity by default on my computer, and I realized that the root user could run it but nor
You're right, it's Python 2.7.5. I wouldn't have expected that to matter because the error I'm seeing is related to wildcard certificate matching: Failed to create bucket (attempt #1) 'test.foo.com'
That's correct. s3-eu-west-1.amazonaws.com/tango.bitcetera.com/ ? Exactly the same error occurs with the above URL (tried with and without the trailing slash).
That's interesting. It looks like maybe your s3+http URI got translated into an incorrect s3:// URI. I take it your bucket name is "tango.bitcetera.com"? What happens if you use the URL s3://s3-eu-w
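To see why a hostname-style bucket name like "tango.bitcetera.com" invites this kind of mistranslation, here is a small illustrative parse of an s3+http:// URL (the URL and the split shown are assumptions for illustration, not duplicity's actual backend logic): both the netloc and the first path component look like hostnames, so a backend can easily pick the wrong one as the bucket.

```python
from urllib.parse import urlparse

# Illustrative parse of an s3+http URL with an explicit endpoint.
# With a dotted bucket name, the endpoint and the bucket are easy
# to confuse when rewriting the URI into s3:// form.
url = "s3+http://s3-eu-west-1.amazonaws.com/tango.bitcetera.com/backups"
parts = urlparse(url)

endpoint = parts.netloc                      # the region endpoint
bucket, _, prefix = parts.path.lstrip("/").partition("/")
# bucket is "tango.bitcetera.com", prefix is "backups"
```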
Hi, I'm back! I've started by cleaning out the easy stuff, applying patches and merging in code that others supplied. It is tested, as far as the test cases will take it, but be careful. Most o
boto v2 added support for Google Cloud Storage by creating a storage abstraction layer. I took advantage of that to add support for Google Cloud Storage to duplicity. Merging this patch would mea
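The storage abstraction layer idea can be sketched generically: callers code against one bucket interface, and the URI scheme picks the concrete backend. The classes and methods below are hypothetical stand-ins to show the shape of the approach, not boto's actual API.

```python
# Illustrative sketch of a storage abstraction layer in the spirit of
# boto v2: the scheme in the URI ("s3" vs "gs") selects the backend.
# All names here are hypothetical, not boto's real classes.

class StorageBucket:
    def __init__(self, name):
        self.name = name
        self._objects = {}          # in-memory stand-in for remote storage

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]

class S3Bucket(StorageBucket):
    scheme = "s3"

class GSBucket(StorageBucket):
    scheme = "gs"

BACKENDS = {cls.scheme: cls for cls in (S3Bucket, GSBucket)}

def bucket_for(uri):
    """Pick a backend from the URI scheme, e.g. gs://my-backups."""
    scheme, _, name = uri.partition("://")
    return BACKENDS[scheme](name)

b = bucket_for("gs://my-backups")
b.put("duplicity-full.vol1.gpg", b"...encrypted volume...")
```

The appeal for duplicity is exactly what the message describes: one backend implementation against the abstract interface covers both S3 and Google Cloud Storage.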
Hi, OK, I sorted out my mailman problems :) As far as I understand, __fetch_entries is used to list directories and files that match a 'filename' pattern. It sends a request to google servers with a f
not familiar with the backend code, but if you could explain a bit more why it failed and why it works now (what the code did before vs. now), i could set up a launchpad branch for the maintainer to me
Hello everyone, First, let me thank you for a wonderful utility. I love duplicity ! I use duplicity 0.6.21. I have been playing with 'big' backups on google drive. I use the gdocs backend. The full b
[ Resent from subscribed e-mail address ] Hi, I have a server that gets backed up to a Google Drive account via the gdocs duplicity back-end. I do a full backup once a month and incremental updates e
looks healthy to me. not sure about different block sizes, Ken or Mike maybe can comment on that. posting here is the second best approach to get changes merged. registering with launchpad and adding a b
Hi everyone. I was trying to use duplicity to perform a full backup of our system here (~7TB of data to be backed up) on our EMC ATMOS "local cloud" storage. The first problem I encountered if
this is from the go (a google programming language) sources http://code.google.com/p/go/source/browse/src/pkg/crypto/x509/verify.go#128 assuming this is correct, then it's a new level of professional
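The per-label wildcard matching done in the linked verify.go can be sketched compactly. This is a simplified illustration of the rule, not Go's actual code: the pattern and hostname are split into dot-separated labels, the label counts must match, and "*" matches exactly one whole label, so "*.foo.com" matches "test.foo.com" but not "foo.com" or "a.b.foo.com".

```python
# Simplified sketch of per-label wildcard hostname matching, in the
# spirit of Go's crypto/x509 matchHostnames. Illustrative only.

def wildcard_match(pattern, hostname):
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    # A wildcard stands for exactly one label, so the counts must match.
    if len(p_labels) != len(h_labels):
        return False
    for p, h in zip(p_labels, h_labels):
        if p != "*" and p != h:
            return False
    return True
```

Under this rule, a certificate for "*.foo.com" would not cover the bare domain "foo.com", which is one common source of the wildcard-matching failures discussed above.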