[Savannah-hackers-public] Re: New savannah site
From: Sylvain Beucler
Subject: [Savannah-hackers-public] Re: New savannah site
Date: Tue, 22 Dec 2009 01:47:08 +0100
User-agent: Mutt/1.5.20 (2009-06-14)
On Mon, Dec 21, 2009 at 09:13:30AM +0100, Henrik Sandklef wrote:
> >For VCSes, I'm not sure that Savane is the place to add this.
>
> The problem as I see it is that, for example, CVS does not provide an
> RSS feed the way the newer VCSes (bazaar, git...) do.
Technically, I think those RSS "feeds" are generated on the fly, rather
expensively, by the web-based repository browsers.
> So in order to have RSS
> feeds from CVS we need to either:
>
> 1. Constantly scrape the repo (from an external computer) to find
> out whether any new stuff was added
>
> 2. Add some kind of software to do this on the CVS 'server' (read savannah)
>
> Comments on (1):
> + clean CVS set up at savannah
> - this may lead to too many "cvs update" runs from the external site
Yes
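For what it's worth, here is a rough sketch of what the external poller
in (1) could look like - the CVSROOT, module name and rlog parsing below
are placeholders, not something tested against Savannah:

#!/usr/bin/env python
# Sketch of option (1): poll a CVS module over the network with
# "cvs rlog" and report revisions committed after a given date.
# CVSROOT, MODULE and the regexp are assumptions.
import re
import subprocess

CVSROOT = ":pserver:anonymous@cvs.example.org:/sources/project"
MODULE = "project"

# rlog prints one "date: ...;  author: ...;" line per revision.
REV = re.compile(r"^date:\s+([^;]+);\s+author:\s+([^;]+);", re.M)

def new_revisions(since):
    """Return (date, author) pairs for revisions newer than `since`."""
    out = subprocess.check_output(
        ["cvs", "-d", CVSROOT, "rlog", "-N", "-d", ">" + since, MODULE])
    return REV.findall(out.decode("utf-8", "replace"))

if __name__ == "__main__":
    # A cron job would remember the last poll date and turn each new
    # revision into an RSS/Atom item; here we just print them.
    for date, author in new_revisions("2009-12-20"):
        print("%s %s" % (date, author))

Whether that is actually lighter than a periodic "cvs update" from the
outside, I don't know - but at least it avoids keeping a checkout around.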
> Comments on (2):
>
No comments? :)
> >Btw, I'd like to know if you did any work on the privacy matters that
> >are related to scraping.
>
> >AFAIU, you're working in this field as part of your thesis, and other
> >people/companies in the world also work on this. This makes it
> >possible to get data on projects, which I think is fine, but also on
> >individuals, which I think is a problem. For example, one can easily
> >compute the average work hours (and more generally the work habits)
> >of a specific developer.
>
> At least the hours spent committing code. Unfortunately, that only
> covers a small part of an engineer's work.
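Right, but that's already quite telling. To make the point concrete, a
few lines of scripting against any public clone are enough to get a
per-author hour-of-day histogram (rough sketch; the clone path and
author name are placeholders):

#!/usr/bin/env python
# Rough sketch: hour-of-day histogram of one author's commits,
# computed from a local clone of a public git repository.
# The repository path and author name are placeholders.
import subprocess

def commit_hours(repo, author):
    # %ai prints the author date as "YYYY-MM-DD HH:MM:SS +ZZZZ",
    # i.e. in the author's own recorded timezone.
    out = subprocess.check_output(
        ["git", "log", "--author=" + author, "--format=%ai"], cwd=repo)
    hours = {}
    for line in out.decode().splitlines():
        hour = int(line.split()[1][:2])
        hours[hour] = hours.get(hour, 0) + 1
    return hours

if __name__ == "__main__":
    histogram = commit_hours("/path/to/clone", "henrik")
    for hour in sorted(histogram):
        print("%02dh: %s" % (hour, "#" * histogram[hour]))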
>
> >Previously I felt somehow protected by 1) the amount of noise around
> >the traces I produce, making it hard to gather them, and 2) the fact
> >that digging up such traces and showing them off would amount to
> >voyeurism and would be frowned upon as such. With the development of
> >scraping technologies, these protections are destroyed; so, is there
> >any progress on re-improving privacy?
>
> I think you're very right about (1). Given that there are sites doing
> stat digging already, I think that your suggestion is good and valid.
>
> One could argue and say that this should be up to every developer to
> solve. As an example I could commit to a secret repo and do commits
> from that repo to the real (public) repo.
It would be bad if developers decided to switch back en masse to
private development just because anything public gets harvested for all
kinds of misuse. Somebody (not each developer individually - rather,
e.g., the VCS developers) has to come up with finer solutions.
> I think there is also a risk of commit behaviour changing to please
> the VCS stat software. (BTW, my commits on Xnee last night may give me
> many points, since I made so many stupid small errors, resulting in
> tons of small commits....)
I think this already happened. SF's "top 10 most active" project list
used to rely on commit counts, which clearly favour Xnee-last-night-style
commits :)
> >E.g. (wild idea) one could use a git frontend that resets all the
> >commit hours to 'this morning, midnight', which wouldn't affect the
> >stats but would avoid leaking privacy-sensitive info.
>
> Interesting. And one could also anonymise the commits.
This may be going too far - tracking who commits what is important
information. Then again, in bzr and git this information is very easy
to manipulate without admin access to the server.
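E.g. the "midnight" idea is just two environment variables away in git -
something along these lines (untested sketch, to be run in place of
plain "git commit"):

#!/usr/bin/env python
# Untested sketch of the "midnight commit" idea: a wrapper around
# "git commit" that sets both git date variables to today at 00:00,
# so commits no longer leak the actual time of day.
import datetime
import os
import subprocess
import sys

stamp = datetime.date.today().strftime("%Y-%m-%d 00:00:00")

env = dict(os.environ)
env["GIT_AUTHOR_DATE"] = stamp
env["GIT_COMMITTER_DATE"] = stamp

# Pass any extra arguments straight through to "git commit".
sys.exit(subprocess.call(["git", "commit"] + sys.argv[1:], env=env))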
--
Sylvain