Re: [Help-gnunet] Is my reputation hurting me?


From: Igor Wronsky
Subject: Re: [Help-gnunet] Is my reputation hurting me?
Date: Sat, 12 Jul 2003 15:53:01 +0300 (EEST)

On Fri, 11 Jul 2003, Lance Simmons wrote:

> For the first week or so I had a gnunet node running, I was able to
> retrieve files pretty easily.  Since then, though, things have really
> slowed down.  Now it seems that gnunet-download can run for days at a
> time without retrieving anything, no matter what file I'm requesting.

Does this also go for files that you've already downloaded or
that have been inserted locally?

> Is my reputation hurting me?

As it can never be less than 0 (the start value), that shouldn't
be the issue here: it seems unlikely that gnunet traffic would
have increased drastically in the last few weeks... (any second
opinions?)

> The output of gnunet-stats worries me --
> I'm dropping lots of packets.

That's quite normal. The number of packets you can send out
naturally depends on the bandwidth limit you've set. I'm
running a node with a 3k/s upstream limit and my sent/dropped
ratio is about 0.23 ~= 1/4. Still, it seems that I'm able
to download and upload files to/from others, though
only *very* slowly. Also, search results may take a really
long time to arrive.
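
For what it's worth, here's a back-of-envelope sketch of what such a
ratio implies (my own toy model, not anything gnunet reports: it assumes
the node is saturated, packets are roughly equal size, and everything
over the cap gets dropped):

def implied_offered_rate(upstream_limit, sent_dropped_ratio):
    """Outbound load implied by a hard upstream cap and the sent/dropped ratio.
    Model: sent ~= cap, dropped ~= offered - cap, so ratio = cap / (offered - cap)."""
    return upstream_limit * (1.0 + 1.0 / sent_dropped_ratio)

# the 3 kB/s cap and ~0.23 ratio above would mean the node is being asked
# to send roughly 16 kB/s, i.e. about five times its limit
print(implied_offered_rate(3.0, 0.23))

So a lot of dropping is simply what you get when the network wants to
push several times more traffic through your node than you allow.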

> Here's my current output --

There doesn't seem to be anything suspicious (to me) in the
statistics.

I tried to access four or five mp3s today that were
displayed in search results. I couldn't get a single block
of them (this is easily seen in gnunet-gtk: if ActiveReq
stays at 1, the download hasn't received anything, not even
any metadata related to the file!). I also tried to
download a file that certainly exists but is non-local, and
that succeeded.

The problem here is rooted in the search result
migration routines, which I have previously argued
against (though it seems I haven't been able to
propose any acceptable solution). The thing is that
due to the current mechanism, even if you see a file
as a search result, there is no guarantee - not even
at that very moment - that a single byte of that file
is available in the network. In my opinion the current
mechanism will, in the long run, leave all reasonable
search keys saturated with files that are not available
anywhere.
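
To make the saturation argument concrete, here is a toy simulation
(purely illustrative - the block types, the 80/20 request split and the
popularity-based cache are my own simplifications, not GNUnet's actual
data structures or migration code) of what happens when keyword blocks
are requested more often than the content blocks they point to:

import random
from collections import Counter

random.seed(1)

NUM_FILES = 50     # each file has one keyword (search result) block and one content block
CACHE_SIZE = 50    # the node only has room for this many blocks
REQUESTS = 5000

hits = Counter()
for _ in range(REQUESTS):
    if random.random() < 0.8:
        # most traffic is keyword searches, which hit search result blocks
        hits[("keyword", random.randrange(NUM_FILES))] += 1
    else:
        # only a minority of requests fetch a specific file's content
        hits[("content", random.randrange(NUM_FILES))] += 1

# crude "migration" policy: keep only the most requested blocks
kept = {block for block, _ in hits.most_common(CACHE_SIZE)}

dangling = sum(1 for i in range(NUM_FILES)
               if ("keyword", i) in kept and ("content", i) not in kept)
print("search results whose content is gone:", dangling, "/", NUM_FILES)

With those numbers essentially every keyword block survives the cache
while the content blocks get evicted, which is exactly the dangling
search result effect described above.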

The namespace/directory scheme under development will
indirectly ease this problem somewhat by allowing
authors to create up-to-date filelists, but it is not
enough to solve the search problem. Some kind of content
availability measurement or voting would be required. A simpler,
but quite brutal, method would be to refuse to store any
search result to disk unless the related file was
either locally inserted or fully downloaded. In both cases
the server should also remove the search result block
once the file becomes locally incomplete for one reason
or another.
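
A sketch of what that brutal policy could look like in code (every name
here - local_file_complete, should_store_result and so on - is
hypothetical and made up for illustration; this is not GNUnet's actual
API):

def local_file_complete(file_hash, local_blocks):
    """True if every block of the file is present locally.
    local_blocks maps a file hash to a list of booleans, one per block (toy model)."""
    blocks = local_blocks.get(file_hash)
    return blocks is not None and all(blocks)

def should_store_result(file_hash, local_blocks):
    # refuse to store a search result for content we don't fully have
    return local_file_complete(file_hash, local_blocks)

def purge_stale_results(result_index, local_blocks):
    # drop search result blocks whose file has become locally incomplete;
    # result_index maps a keyword to the file hashes advertised under it
    return {keyword: [h for h in hashes if local_file_complete(h, local_blocks)]
            for keyword, hashes in result_index.items()}

# example: we have all of file "A" but only half of file "B"
local_blocks = {"A": [True, True, True], "B": [True, False]}
print(purge_stale_results({"mp3": ["A", "B"]}, local_blocks))   # {'mp3': ['A']}

The point is just that a search result block never outlives the local
copy of the content it advertises.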

In short: the search sucks. And unless this is changed, it will
suck more as time passes. :( It's utopian to think that
nodes would stay around and support the content they've inserted
forever, or to think that most content would be so popular
that it'd stay online without the inserting node constantly
having it - but forever is practically the duration a
search result currently stays available in gnunet; the
reason is that a common keyword is always more popular
than any content it's pointing to.

Sorry for the technicalities. ;)


Igor