I also get duplicate files (same filename, same parent folder) in
Google Drive when using duplicity and have to remove them manually
before retrying. On large backups (thousands of volumes) this happens
quite often.
I think what is happening is this: after each volume upload, duplicity
checks the size of the file on Google Drive, and this call sometimes
returns -1. (On later inspection the original upload usually appears
to have completed successfully, so I think this is some sort of lag on
Google Drive's side.) Duplicity concludes the upload failed and
re-uploads the file. But on Google Drive, uploading a file with the
same name into the same parent folder doesn't overwrite the existing
file; it creates a duplicate, since Drive identifies files by ID
rather than by name.
Maybe this could be fixed by having duplicity delete the original
upload (which it already believes to be faulty) before re-uploading?
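To illustrate the idea, here is a minimal, self-contained sketch (not
duplicity's actual code, and no real Drive API calls). The `FakeDrive`
class stands in for a Drive folder where names are not unique keys,
and `upload_with_retry` shows the proposed fix: delete any same-name
files in the folder before each retry. All names here
(`upload_with_retry`, `lagging_check`, etc.) are invented for the
example.

```python
import itertools

class FakeDrive:
    """Stand-in for a Drive folder: names are NOT unique, IDs are."""
    def __init__(self):
        self._ids = itertools.count(1)
        self.files = {}  # file_id -> (name, size)

    def upload(self, name, data):
        # Like Drive, a second upload with the same name makes a new file.
        file_id = next(self._ids)
        self.files[file_id] = (name, len(data))
        return file_id

    def delete(self, file_id):
        del self.files[file_id]

    def find(self, name):
        return [fid for fid, (n, _) in self.files.items() if n == name]

    def size(self, file_id):
        return self.files[file_id][1]

def upload_with_retry(drive, name, data, size_check, max_tries=3):
    """Retry when the post-upload size check fails, but first delete
    any existing files with the same name so duplicates never pile up."""
    for attempt in range(max_tries):
        for fid in drive.find(name):  # the proposed fix
            drive.delete(fid)
        fid = drive.upload(name, data)
        if size_check(drive, fid, attempt) == len(data):
            return fid
    raise RuntimeError("upload failed after %d tries" % max_tries)

# Simulate the lag: the size check reports -1 on the first attempt.
def lagging_check(drive, fid, attempt):
    return -1 if attempt == 0 else drive.size(fid)

drive = FakeDrive()
upload_with_retry(drive, "duplicity-full.vol1.difftar.gpg",
                  b"payload", lagging_check)
# Exactly one copy of the volume remains despite the failed check.
assert len(drive.find("duplicity-full.vol1.difftar.gpg")) == 1
```

Without the delete-before-retry step, the same run would leave two
files named `duplicity-full.vol1.difftar.gpg` in the folder, which is
exactly the manual cleanup I have to do today.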