
[Sks-devel] Request: Install an efficient robots.txt file


From: robots.txt fan
Subject: [Sks-devel] Request: Install an efficient robots.txt file
Date: Tue, 20 Jun 2017 04:35:42 -0400

Dear Sirs and Madams,

I would like to thank all of you for doing this. You are a necessary pillar of PGP, and it is awesome that you are there to provide the infrastructure that hosts everyone's keys.

Without attempting to diminish the previous sentence, I have a request to make to some of you.

Most of the SKS keyservers serve an effective robots.txt that prevents everyone's un-deletable name and email address from showing up in search engines. However, there are some exceptions. I like to keep a low profile, but when I search for my name, for example on Google, a significant number of results come from SKS pages, or to be more specific, these:

keyserver.nausch.org
pgp.net.nz
pgp.circl.lu
keyserver.rayservers.com
sks-keyservers.net
keyserver.mattrude.com (a special case: it blocks /pks but not /search, a non-standard (?) path; see the extended example below)

I would like to ask the owners of these servers to take the time to install an effective robots.txt file, for example something like this:

User-agent: *
Disallow: /pks/
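
For a server that also exposes search results under a non-standard path, as keyserver.mattrude.com apparently does with /search, the /pks/ rule alone is not enough. A sketch of an extended file, assuming /search is the only additional path that needs covering:

User-agent: *
Disallow: /pks/
Disallow: /search

Since robots.txt rules match by path prefix, the /search line also covers queries such as /search?q=..., so well-behaved crawlers would skip indexed lookups through either entry point.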

To all others, I would like to ask you to take a moment to check whether your server serves an effective robots.txt file and, if it does not, to please install one.

If there is any doubt about whether a robots.txt file is a good idea, I am happy to elaborate.

Thank you for your time.

RTF
