Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt
From: Karl Berry
Subject: Re: [Savannah-hackers-public] Please disallow www-commits in robots.txt
Date: Fri, 12 May 2023 15:10:46 -0600
> In sum, this private playground is something webmasters want and need,
> and search engines should have no business indexing it. Is it possible?
robots.txt will not stop ill-behaved robots from indexing. Although
Google and Duck Duck Go, to the best of my knowledge, do respect it, it
is no panacea. So I don't think robots.txt solves the problem.
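(For concreteness, the rule being asked for would presumably look
something like the following; the exact archive path is just my guess,
not something confirmed in this thread:

    User-agent: *
    Disallow: /archive/html/www-commits/

But again, only well-behaved crawlers will honor it.)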
Also, as Alfred said, it feels quite weird to me to try to hide public
pages from "robots". If a page is readable by a random human member of
the public, then philosophically it seems to me it should also be
readable by robots (resources permitting, which is not the issue here).
Thus, using a separate and private repo like www-fr, as Thérèse
suggested, sounds to me like the best solution, both technically and
philosophically. I'm sure it is possible somehow to restrict viewing of
the www-fr web pages to www members. That should be much easier than
doing it for a subdirectory (/staging) of a public repo, it seems to me.
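Just to sketch the idea (not how Savannah is actually set up, as far as
I know), the web server side could be something along these lines, with
the directory, password/group files, and group name all hypothetical:

    # Hypothetical Apache config restricting the www-fr staging pages
    # to authenticated members of the "www" group.
    <Directory "/var/www/www-fr">
        AuthType Basic
        AuthName "www members only"
        AuthUserFile /etc/apache2/www-fr.passwd
        AuthGroupFile /etc/apache2/www-fr.groups
        Require group www
    </Directory>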
My $.02, FWIW ... -k