From: Yavor Doganov
Subject: Re: [Savannah-hackers-public] robots.txt disallows all spiders for mailing lists
Date: Sun, 05 Jul 2009 21:59:14 +0300
User-agent: Wanderlust/2.15.5 (Almost Unreal) SEMI/1.14.6 (Maruoka) FLIM/1.14.9 (Gojō) APEL/10.7 Emacs/22.3 (i486-pc-linux-gnu) MULE/5.0 (SAKAKI)
Noah Slater wrote:
> User-agent: *
> Disallow: /
>
> The effect is that no mailing lists are spidered by Google,

This is a deliberate decision by the GNU sysadmins, although I can't tell right now the reasons behind it (and I don't care, actually -- not being indexed by proprietary search engines does not seem to me like a problem that needs fixing). Maybe Sylvain or Karl can recall.
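[For context: the two quoted lines are the broadest possible robots.txt policy -- `User-agent: *` matches every crawler, and `Disallow: /` excludes every path. A sketch of a narrower policy is below; the `/archive/` path is purely illustrative, not Savannah's actual layout, and `Allow` is an extension to the original robots.txt convention that major crawlers such as Googlebot honor.]

```
# Quoted policy: every crawler, entire site excluded.
User-agent: *
Disallow: /

# Illustrative narrower alternative (hypothetical /archive/ path):
# permit crawling of list archives while excluding everything else.
# User-agent: *
# Allow: /archive/
# Disallow: /
```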