
Re: [Duplicity-talk] proposal: backup space usage distribution analysis


From: Kenneth Loafman
Subject: Re: [Duplicity-talk] proposal: backup space usage distribution analysis
Date: Wed, 12 Dec 2007 06:48:34 -0600
User-agent: Thunderbird 1.5.0.14pre (X11/20071023)

Andreas Schildbach wrote:
> Gabriel Ambuehl wrote:
> 
>> How could duplicity know what size it will take before actually trying
>> to backup a file?
> 
> Actually I had in mind an analysis of an already existing backup rather
> than an estimation of a future backup.

I think part of that analysis would be the compression factor.  For
directories that are mostly text it would be high, for others low,
depending on content.
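
Something like the rough sketch below could measure that.  It is plain
Python with zlib, not duplicity code, and the compression level (6) is
just an assumption to roughly approximate gzip's default:

#!/usr/bin/env python
# Rough sketch, not part of duplicity: estimate a per-directory
# compression factor by zlib-compressing every regular file under a
# given root and comparing raw vs. compressed byte counts.
import os
import sys
import zlib

def dir_compression_factors(root):
    totals = {}  # directory path -> (raw bytes, compressed bytes)
    for dirpath, dirnames, filenames in os.walk(root):
        raw = comp = 0
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not os.path.isfile(path):
                continue
            try:
                data = open(path, 'rb').read()
            except (IOError, OSError):
                continue  # skip unreadable files
            raw += len(data)
            comp += len(zlib.compress(data, 6))
        totals[dirpath] = (raw, comp)
    return totals

if __name__ == '__main__':
    for d, (raw, comp) in sorted(dir_compression_factors(sys.argv[1]).items()):
        # factor = raw/comp, so text-heavy directories score high
        factor = (float(raw) / comp) if comp else 1.0
        print('%-40s raw=%10d comp=%10d factor=%.2f' % (d, raw, comp, factor))

Run it as 'python compfactor.py /some/dir' (the script name is whatever
you save it as) and you get one line per directory.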

The 'list-current-files' command will tell you what the total collection
looks like, but I'm guessing something like this for every file would be
handy (file.foo grew by 200 raw bytes and 40 compressed bytes, and is
backed up in <full> and <inc 2>, plus whatever else we can gather):

                                             raw    comp
file                 orig  comp  last      delta   delta
/path/name/file.foo  50000 22200 12/10/07      0       0  <full>
                     50200 22240 12/12/07    200      40  <inc 2>

plus, at the end of each directory, some global change factors.  The
file-by-file list could be an optional feature, with the directory list
being the default output.
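
To make the directory-level roll-up concrete, here is a rough sketch.
The FileRecord input format is made up purely for illustration; in
duplicity the numbers would have to come from the manifests/signatures
rather than from a hand-built list:

# Sketch: roll per-file size history up into per-directory change
# factors (raw and compressed deltas between successive backup sets).
import os
from collections import namedtuple, defaultdict

FileRecord = namedtuple('FileRecord', 'path date orig comp backup_set')

def directory_change_factors(records):
    # group each file's history, then diff successive snapshots
    per_file = defaultdict(list)
    for r in records:
        per_file[r.path].append(r)
    totals = defaultdict(lambda: [0, 0])  # dir -> [raw delta, comp delta]
    for path, history in per_file.items():
        history.sort(key=lambda r: r.date)  # ISO dates sort correctly as strings
        for prev, cur in zip(history, history[1:]):
            d = os.path.dirname(path)
            totals[d][0] += cur.orig - prev.orig
            totals[d][1] += cur.comp - prev.comp
    return totals

if __name__ == '__main__':
    records = [
        FileRecord('/path/name/file.foo', '2007-12-10', 50000, 22200, '<full>'),
        FileRecord('/path/name/file.foo', '2007-12-12', 50200, 22240, '<inc 2>'),
    ]
    for d, (raw, comp) in sorted(directory_change_factors(records).items()):
        print('%-20s raw delta=%7d  comp delta=%7d' % (d, raw, comp))

With the two example records above it prints the same 200/40 delta for
/path/name that the table shows for file.foo.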

...Ken

