Re: affect of large numbers of tags on performance

From: Paul Sander
Subject: Re: affect of large numbers of tags on performance
Date: Wed, 26 Nov 2003 07:53:47 -0800

There's a small incremental cost per tag per file: each tag is represented
by a line in each RCS file. This is typically under 100 bytes, though it
depends on the length of the tag name and the depth of the branch to which
the tag is applied. In terms of time, this typically amounts to a few
milliseconds per file per 10,000 tags. The performance penalty is greater
for "cvs log", which has the additional overhead of formatting and
displaying the entire tag list.
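To make the per-tag cost above concrete, here is a minimal sketch that parses the symbols section of an RCS ",v" admin header and counts the bytes each tag entry consumes. The sample header and tag names are invented for illustration; real headers vary in whitespace and layout.

```python
import re

def tag_overhead(rcs_header: str) -> dict:
    """Return per-tag byte counts parsed from the symbols section
    of an RCS admin header (entries look like name:revision)."""
    # The symbols list runs from the "symbols" keyword to the next semicolon.
    m = re.search(r"symbols\s*(.*?);", rcs_header, re.DOTALL)
    if not m:
        return {}
    costs = {}
    for entry in m.group(1).split():
        name, _, rev = entry.partition(":")
        # Bytes consumed: tag name + ':' + revision string + separator.
        costs[name] = len(name) + 1 + len(rev) + 1
    return costs

# Invented RCS admin header with three build tags, two on the same revision.
header = """head 1.3;
access;
symbols
    build_20031125:1.3
    build_20031124:1.2
    build_20031123:1.2;
locks; strict;
"""

costs = tag_overhead(header)
print(sum(costs.values()), "bytes for", len(costs), "tags")
```

Note that tags on deep branches carry longer revision strings (e.g. 1.2.4.7), which is why the per-tag cost grows with branch depth.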

I used a method similar to yours for 5 years on several projects, without
much complaint about performance.  I did find myself wishing to be able to
clean old tags just to reduce clutter, but there was no technical requirement
to do so.
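If you did want to clean up old tags, one possible scheme is to select automated build tags older than a cutoff and delete each with "cvs rtag -d". This is only a sketch: the "build_YYYYMMDD" naming convention and the module name "mymodule" are assumptions, and it dry-runs by default.

```python
import subprocess
from datetime import date

def select_old_build_tags(tags, cutoff):
    """Return build_YYYYMMDD tags whose embedded date is before cutoff."""
    old = []
    for tag in tags:
        if not tag.startswith("build_") or len(tag) != 14:
            continue  # not a build_YYYYMMDD tag
        try:
            stamp = date(int(tag[6:10]), int(tag[10:12]), int(tag[12:14]))
        except ValueError:
            continue  # date digits don't parse
        if stamp < cutoff:
            old.append(tag)
    return old

def purge(tags, cutoff, module="mymodule", dry_run=True):
    """Delete each selected tag repository-wide with 'cvs rtag -d'."""
    for tag in select_old_build_tags(tags, cutoff):
        cmd = ["cvs", "rtag", "-d", tag, module]
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
```

Deleting with rtag rewrites every affected RCS file, so a purge like this is best run during quiet hours.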

--- Forwarded mail from address@hidden
We use an automated build tool that performs many builds each day. Each time
there is a successful build it applies an appropriate tag to our CVS
repository. This has been running for a number of months now and on many
files there are well over 100 tags - for files that don't change much, all
100 tags are on the same version number. Obviously this tag count will only
increase over time.
Does anyone know if we should be concerned about this? How does such a large
number of tags affect performance (checkout, commit, tag, log etc)? How does
it affect the size of the repository in terms of disk space? Should we be
looking to implement a scheme to purge old tags?

--- End of forwarded message from address@hidden
