
[Duplicity-talk] cleanup process is 13GB and takes >16 hours at 100% CPU

From: Ian Chard
Subject: [Duplicity-talk] cleanup process is 13GB and takes >16 hours at 100% CPU
Date: Thu, 31 Jul 2014 15:13:11 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Thunderbird/31.0


I'm running a cleanup of a large target, and the duplicity process grows
to 13GB and uses 100% CPU for many hours.  So far it has been running for
over 16 hours, and apart from reading a signature file every few hours it
gives no indication of progress.  Since it's spinning at 100% on the CPU,
this isn't the remote end being slow.  I've had to run it on a spare
machine because it uses so much memory.
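For anyone trying to reproduce this, the memory and CPU figures can be read straight from /proc and ps while the process runs. A minimal sketch (the duplicity PID is a placeholder; the shell's own PID stands in here so the commands run as-is):

```shell
# Placeholder: substitute the real duplicity PID for "$pid".
pid=$$

# VmRSS is resident memory, VmSize is virtual; both reported in kB on Linux.
awk '/^Vm(RSS|Size)/ {print $1, $2, $3}' /proc/"$pid"/status

# CPU share and elapsed time for the process (confirms the spin is CPU-bound,
# not I/O wait on the remote end).
ps -o pid,pcpu,rss,etime -p "$pid"
```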

My command line is:

duplicity -v9 --archive-dir /data/duplicity-archive/ \
    --gpg-options "--homedir=/data/gpg-home/" --encrypt-key xxxxxxxx \
    --asynchronous-upload --full-if-older-than 10D --allow-source-mismatch \
    --num-retries 1 cleanup --force cf+http://my.target.name

I'm using duplicity 0.6.24, python 2.7.3, and Debian Wheezy.

Is this a scalability limitation of duplicity, or is it more likely a bug?

I've logged a Launchpad bug
(https://bugs.launchpad.net/duplicity/+bug/1350404), but I see there are
many bugs in 'New' status, so I thought I'd try the mailing list too.

Thanks for any help
- Ian

Ian Chard   <address@hidden>
mySociety systems administrator   http://www.mysociety.org/
